Cognitive Debt: The Hidden Cost of Using AI to Think for Us

This article explores how heavy reliance on AI tools like ChatGPT may reshape human thinking. A recent MIT study with 54 participants found that AI-assisted writing lowered brain connectivity, reduced recall, weakened ownership of ideas, and produced more homogeneous language. The researchers call this effect “cognitive debt”: a buildup of mental costs from outsourcing too much thinking to machines. While AI delivers real benefits in transformation projects and productivity, overuse risks eroding critical human skills such as memory, authentic communication, and conflict resolution. Drawing on Simon Sinek’s warning that “AI is a great boat, but you still need to learn to swim,” the article argues for mindful, sovereign use of AI. Recommendations include balancing AI and non-AI work, reading deeply, engaging in real debates, and even hosting personal AI systems to retain control. The core message: embrace AI’s power, but don’t abandon the human “muscles” that make us resilient, authentic, and independent.

Remembering Phone Numbers, Forgetting Our Words: What AI Might Be Doing to Our Minds

We’ve already outsourced memory to our phones. A new MIT-led study suggests that heavy reliance on AI may also outsource thinking, lowering engagement, recall, and ownership of our words. The challenge isn’t rejecting AI but learning to use it wisely, keeping our human skills alive.

Remember the time when we could recall dozens of phone numbers? I still remember calling friends and family from public phone booths, numbers etched in memory without a second thought. Fast forward 30 years of smartphones, and I can barely remember my own number. We’ve outsourced memory to technology, and it has quietly changed how we think.

A recent MIT-led study (https://arxiv.org/pdf/2506.08872) makes me wonder: what if our use of AI tools today is doing something similar, not just with numbers but with our ability to think, write, and own our words? I’m less worried for my generation, but what about the younger ones growing up with AI as their default assistant?

What the study found

The research followed 54 participants over three primary sessions (18 of them returned for a fourth, crossover session), dividing them into three conditions: no tools (Brain-only), Search Engine, and ChatGPT (LLM use). EEG data, essay text analyzed by both humans and algorithms, and recall tasks were all used to compare outcomes.

  • Neural connectivity falls the more external tools are used. Brain-only participants showed the most widespread and strongest EEG connectivity; Search Engine users were in the middle; LLM users showed the weakest connectivity patterns.

  • In the crossover Session 4, those who switched from LLM use back to Brain-only still exhibited reduced alpha and beta connectivity, suggesting that heavy dependence on AI may leave lingering effects. Conversely, those going from Brain-only to LLM regained higher recall and activation in some neural circuits.

  • Ownership and memory suffer with LLM assistance. The ChatGPT group reported the weakest sense of owning their writing, and many participants (about 83%) could not accurately quote from their own essays shortly after writing.

  • Essays by LLM users became more homogeneous. Linguistic and topic analyses (n-grams, named entities, topic ontology) showed that LLM-assisted essays clustered more tightly together, with less variation than in the Search Engine or Brain-only groups.
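To make “homogeneity” concrete: the study’s actual pipeline involves topic ontologies and named-entity analysis, but a much simpler proxy conveys the idea. The sketch below (my own illustration, not the authors’ code) scores a group of essays by their average pairwise n-gram overlap; the more tightly a group’s phrasing clusters, the higher the score.

```python
from itertools import combinations


def ngrams(text, n=2):
    """Return the set of lowercase word n-grams in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}


def mean_pairwise_jaccard(essays, n=2):
    """Average Jaccard similarity of n-gram sets over all essay pairs.

    Higher values mean the group uses more similar language,
    i.e. the essays are more homogeneous.
    """
    sims = []
    for a, b in combinations(essays, 2):
        ga, gb = ngrams(a, n), ngrams(b, n)
        if ga or gb:
            sims.append(len(ga & gb) / len(ga | gb))
    return sum(sims) / len(sims) if sims else 0.0


# Toy comparison: near-identical phrasing vs. varied phrasing
llm_like = [
    "the essay argues that art matters",
    "the essay argues that art matters deeply",
]
varied = [
    "art shapes how we see the world",
    "museums preserve our shared memory",
]
```

On these toy inputs, the near-identical pair scores far higher than the varied pair, which is the pattern the study reports for LLM-assisted essays relative to the other groups.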

The authors term all this “cognitive debt.” The term doesn’t imply damage; rather, it describes the accumulation of mental costs: weaker recall, thinner ownership of ideas, shallower engagement.

I’m not a neuroscientist, but I tend to see the brain as a kind of muscle. And like any other muscle, if we stop using it, we start losing it. The study’s findings remind me that convenience has a cost: when AI takes over too much of the heavy lifting, we risk letting core cognitive abilities atrophy from lack of use.

Why this matters

We’ve already seen what outsourcing memory to smartphones did: we gained convenience but lost recall. Now, outsourcing thought and composition to AI could shape how deeply we engage with ideas. Over time, this matters for how we learn, collaborate, and even how we lead.

  • Losing “unscripted” social skills. If we outsource apologies, tough talks, or negotiations to AI, we risk dulling our ability to interact, discuss, and argue without assistance. Real relationships are built by repairing mistakes, not by delivering perfect, AI‑composed lines.

  • Journey versus destination. AI optimizes outputs: the perfect email, pitch, or essay. But growth comes from the struggle: drafting, debating, and resolving conflict. Skipping the work can mean skipping the learning.

  • Boats versus swimming. AI is a great boat: fast, reliable, and helpful. But in a storm you still need to swim. Listening, improvising, showing empathy: these human muscles atrophy if we never use them.

  • Authenticity is at a premium. The internet is filling with flawless, cookie‑cutter prose, and people discount messages that don’t sound like you. Imperfections (your phrasing, your cadence, even your typos) signal trust and authenticity.

  • Human skills at stake. Listening, giving and receiving feedback, resolving conflict peacefully, and showing empathy are muscles that only grow through practice. If we let AI handle the difficult parts, we may keep the result but lose the resilience.

  • Societal context. In a world already marked by loneliness and disconnection, doubling down on tools that replace human contact can amplify the problem. Technology should augment our humanity, not anesthetize it.

“AI is a great boat, but you still need to learn to swim.” Simon Sinek

Simon Sinek, in his conversation on The Diary of a CEO (https://www.youtube.com/watch?v=W4tqbEmplug), warned about a similar risk: by letting AI handle our apologies, negotiations, or tough conversations, we may lose the ability to interact authentically without assistance. He described it as the difference between owning a boat and knowing how to swim. AI can help us get across the water faster, but when the storm comes, only those who practiced swimming will survive. This echoes the MIT study’s concern: convenience today can mean fragility tomorrow.

How to use AI without losing yourself

Start each project with a short “cold open”: five minutes of solo writing to capture your ideas. Then use AI as a critic, not as the author: let it challenge your structure, highlight blind spots, or polish grammar. Before finalizing, step away and try to recall two sentences from your draft; if you can’t, the text isn’t yet yours. Add your lived voice, stories, numbers, and examples to keep the human fingerprints. Alternate between AI‑heavy and AI‑light days to balance efficiency with mental strength. And once a week, schedule a no‑AI writing sprint to preserve your ability to think and create unaided.

Through my sovereignty lens, it also means strengthening independence. Read books deeply to sharpen your mind, rather than skimming quick feeds. Engage in real debates with people, not just simulations with machines, to train your critical thinking. And when you do use AI, consider hosting your own models or services, so you remain in control of your data and your tools. These choices ensure that AI serves your growth and sovereignty, not the other way around.

My concern

I’m not afraid of AI replacing us. In fact, I actively implement AI and agentic systems in my transformation projects and see many truly valuable use cases. What concerns me more is the quiet erosion of our own thinking through AI shortcuts. Just like I lost the habit of remembering phone numbers, younger generations might lose the habit of remembering their own words.

But my concern goes deeper. I believe in sovereignty: over money, over technology, and over our minds. Outsourcing too much to AI doesn’t just reduce recall; it risks eroding our independence. Sovereignty is about being able to stand on your own two feet when the system falters. If AI gives us boats, that’s helpful. But when the storm comes and the boats fail, we still need to know how to swim. Human skills like listening, debating, leading, and comforting are part of our sovereignty. If we forget them, we hand over more than convenience. We hand over ourselves.

That’s why I see deliberate, mindful use of AI as essential. Not to reject it, but to make sure it serves our humanity and not the other way around.

Question for you: How do you make sure your words still feel like your own when using AI?

If this resonates, follow me for more reflections on tech, sovereignty, and the habits that keep us human. And stay tuned for my upcoming book Brick by Brick: Building a Sovereign Life on Bitcoin, available for pre‑order on 31 October at twentyone.life/brick-by-brick.

