Outsourcing our souls: The quiet rot of AI dependency

I was sharing my thoughts on AI dependency with my wife. As you might imagine, I was expressing concern—not because I think AI should go away, or even that it can. My worry is that it might be making people even dumber.

We see it every single day, and to be honest, it’s still easy to spot. Maybe it won’t be easy in the near future, but for now, it takes about a minute of analysis.

It seems like humanity is hellbent on outsourcing its thinking to machines, and God knows how that will end. Not because the machines would necessarily want to harm us—we don’t know that either—but because this dependency, this trap we’ve fallen into, feels dangerously addictive, like sugar laced with cocaine.

I won’t mention a name because I don’t want to summon this character, but there’s a particular Hivean who literally asks his AI agent how he feels about things. You might think I’m exaggerating, but I’m not. His posts—if you can call them that—are conversations he’s having with Grok about himself, about what he thinks, about what he feels.

As I was reading this slop (even that term feels generous), I remembered an old SNL skit that has taken on new relevance.

“How do I know if I have a headache?”

The question is ridiculous, which is why it was satire. But these days, we seem to be doing something similar with our little AI companions.

“Hey Grok, hey ChatGPT, how do I feel about Trump? How do I feel about Obama? How do I feel about being lonely?”

Introspection, internal dialogue, meditation—what are those? They aren’t convenient. They aren’t just a few keystrokes away. They aren’t cool.

What will this cost us? What are the long-term effects of disconnecting from ourselves? I suspect that as much as we speculate, we’re probably off by more than a few yards. We may be failing to recognize an intellectual and spiritual rot so deep that even the most creative director has yet to commit it to pen or film.

As I share this, a recent memory hits me, and I feel obligated to include it. Not a month ago, there was a major ChatGPT upgrade. The new version accidentally deleted people’s “friends,” and OpenAI found themselves backed into a corner.

It turns out there are people who have fallen in love with their AI agents. There are people who believe they have personal relationships with a large language model, and no amount of evidence will convince them otherwise.

OpenAI ended up reverting the upgrade and restoring these “personalities,” fearing backlash—a feat that I assume wasn’t easy.

Where are we headed next? Who knows. I see ships with holes in their hulls, captains without eyes, and zombified crew members clinging to their addictions just to keep functioning.

But hey, at least we get to have fun… all the way down.

MenO



11 comments

“What will this cost us? What are the long-term effects of disconnecting from ourselves?”

Wow, that really was something, and far more true than I think people realize.

I can always count on you to share this kind of stuff... powerful video.

People always want to speed up. AI is a way for people to speed up. Doing things quickly doesn't always mean they're done right.

We'll have an adjustment. Hopefully.

It's possible that we'll end up incorporating chips into our brains, in which case the conversation may be irrelevant. Upgraded humans could become the norm.

"Chips inside our brains" sounds like taking a chisel to our existing grey matter, and short of a lobotomy, I think I already have enough potatoes up there. :P

Post-humanism and cybernetics/augmentation is a fascinating field of thought within science fiction and, probably within the next 10-15 years, part of our observable reality.

It will all just be tools that we can augment our world with.

Strong point — I worry about this too. AI is a great tool, but when it replaces our inner work like reflection, feeling or moral judgment, we lose something essential.

Haha, yeah. No, seriously: in June and July, when I was working on the Skatehive website every single day, all day, using AI, I think it made me dumber. And it's proven that using AI makes you lose brain cells. Since I'm aware of this, you know, I can try to use my brain harder to prompt the AI, which I usually do, but when it starts making mistakes and it's frustrating, it's just like AHHHH WHYYYY, and that, I think, is a sign of losing brain cells.

We are getting older, brother, so there's that as well. It's a never-ending fight against entropy. Truth is, as long as we don't give up, we will be OK.
