I usually write about the crypto markets, but today I want to explore something that's quietly reshaping how we think, decide, and even invest: our growing dependence on AI.
As traders and investors, our ability to think independently and question our own assumptions has never been more critical.
This piece isn't about crypto directly, but it is about protecting a valuable asset we have as investors: our cognitive independence.
What do you talk about on a first date?
Their hobbies? Their favorite movies?
Or do you ask how they viscerally react when they come home to dirty dishes still sitting on the counter?
For your sake, I hope it's the former. But as you get to know someone, really know them, you start seeing the raw, unfiltered, complex parts too. That's just how relationships evolve.
Weirdly enough, that's exactly what's happening between me and AI.
The Honeymoon Phase
What started as a simple research tool has slowly morphed into something more complicated. At first, ChatGPT was just there to fact-check tweets and summarize articles. Clean. Transactional. Helpful.
But like any relationship, boundaries started to blur.
Soon I was asking it to brainstorm ideas, help structure my thoughts, even coach me through difficult conversations.
The AI became my research assistant, my writing partner, my rubber duck for debugging life problems.
It felt like having a brilliant friend who was always available, never tired, never judgmental.
Who wouldn't want that?
When Helpful Becomes Dangerous
A few months ago, I realized how fundamentally this had changed me. I wasn't just using AI as a tool anymore; I was depending on it for basic cognitive functions.
The realization hit me: as a conscientious over-optimizer, I was getting physical headaches from the sheer volume of information I needed to get through life's basic tasks.
That's when I understood this wasn't just technological assistance anymore. This was cognitive outsourcing at a level that should terrify us.
And I'm not the only one.
Here's what you need to understand: this is hitting Gen Z like a tsunami.
Nobody under 25 I know uses Google anymore. It's all AI now. They've collectively decided to skip the messy process of sifting through information and go straight to having answers delivered with confidence.
If you're a millennial or Gen X reading this thinking "that's weird, but it won't affect me," you're wrong. I'm watching it creep upstream in real time. My older millennial siblings are falling into the same patterns.
Gen X is where Gen Z was a year or two ago, just beginning to explore it, still thinking of it as a helpful tool. The adoption curve is steep, and it's coming for everyone.
The difference is that Gen Z doesn't know what they've lost because they never fully developed those cognitive muscles. But when it hits older generations, the degradation is more obvious, and more alarming.
The Mirror That Never Lies
Here's what makes AI different from other tools we've become dependent on: it doesn't just give you information. It gives you validation.
Social media created echo chambers, but they were communal. You could at least see other people disagreeing, even if the algorithm didn't show you much of it. AI is different. It creates a personalized echo chamber of one.
These models don't just answer your questions; they affirm your worldview. They reinforce your existing beliefs. They rarely push back unless you explicitly ask them to, and even then they do it gently, diplomatically, in whatever tone you prefer.
It's not just an echo chamber. It's a mirror that nods back at you with the authority of omniscience.
The Bullshit Feedback Loop
This is where it gets genuinely dangerous.
AI doesn't just reflect your thoughts back to you; it refines them, articulates them better than you could, and presents them with an aura of authority that makes them feel more true than they actually are.
Your half-baked opinions become fully formed arguments. Your biases become "insights." Your assumptions become "analysis." And because the AI presents everything with such confidence and clarity, it becomes harder to recognize where your thinking ends and the machine's processing begins.
We're not just outsourcing research anymore. We're outsourcing the entire process of developing our own thoughts.
Breaking the Loop
I'm not advocating that you cut AI out of your life entirely.
AI tools are genuinely useful, and this technology isn't going anywhere. But we need to be more intentional about how we use them.
The key is recognizing when you're using AI as a crutch versus a tool. When you find yourself unable to form opinions, make decisions, or even write basic communications without AI assistance, that's when helpful has become harmful.
Try this experiment: spend a week writing your thoughts, emails, and texts without AI assistance. Notice when you feel the urge to outsource your thinking. Pay attention to where your own voice ends and the machine's begins.
The goal isn't to avoid AI entirely. It's to maintain your cognitive independence while benefiting from the technology.
Hit reply; I want to hear about your experience. Has AI changed the way you think?
Because in a world where AI will happily validate any belief you bring to it, the most dangerous thing isn't the technology itself.
It’s losing the ability to think critically about your own bullshit.
If we don’t protect our own voice, AI will become the only one we trust.
And we won’t even notice it happened.
Are you still thinking for yourself?
Or are you just really good at agreeing with a machine that agrees with you?
From where the sun rises first,
Louis Sykes
Senior Crypto Analyst, All Star Charts