Lately, I’ve been wondering: is ChatGPT making me lazy—or worse, dumb?
Take this: I sit down to write an English essay. Half of my brain is yelling, “Just write the essay!” The other half is whispering, “Well, maybe just ask ChatGPT for a few ideas, just to get started…” Most of the time, the rational side wins. But still, I’m using AI more and more—breaking down complicated articles, generating story ideas, rewriting awkward sentences. And sometimes, it feels like I’m outsourcing my brain.
Unfortunately, that might not be just a feeling.
The Study
A group of researchers at MIT recently ran a study with 54 participants, ages 18 to 39. They split them into three groups: one used ChatGPT, one used Google Search, and the last had no tech help at all, just their own brains. Over a few months, each group wrote multiple SAT-style essays (traumatic flashbacks, anyone?). The results? The ChatGPT group showed the lowest brain activity (the researchers tracked it with EEG). Many ended up copy-pasting responses and later couldn’t recall what they’d written. They didn’t even feel like the essays were theirs.
Then, in the final round, the groups swapped roles—AI users went analog and vice versa. Unsurprisingly, the people who had been thinking for themselves all along wrote stronger essays. MIT called the decline among AI users “cognitive debt”—basically, the more you rely on AI, the less you flex your own mental muscles. And over time, that takes a toll.
So… does that mean we should all toss our phones in a lake and throw a fiesta?
Not quite.
While this study sounds alarming, it’s just one piece of a bigger picture. Some argue that the ChatGPT group didn’t get dumber—the others just got smarter. The non-AI group had been writing the whole time. Of course they improved. Meanwhile, the AI group got used to taking shortcuts, and when the tools were taken away, they were rusty.
Meaning: it isn’t necessarily true that AI users took on “cognitive debt.”
What’s the real takeaway?
It’s not that ChatGPT is evil. It’s how we use it that matters. If we treat AI like a calculator for ideas—something to support, not replace, our thinking—then maybe we won’t go full zombie mode. Maybe the key is learning to choose when to rely on AI, and when to sit with the hard stuff ourselves.
In other words, we still have to learn how to think.