ChatGPT Use Linked to Memory Loss and Shallow Thinking in MIT Study
A study from MIT shows heavy reliance on ChatGPT can weaken memory and critical thinking in writers. Balancing AI use with active engagement is key to retaining ideas.

AI Slop: How ChatGPT May Be Hindering Your Brain Gains
A new study from MIT reveals a concerning downside to relying heavily on AI tools like ChatGPT for critical thinking tasks. If you’re a writer, this is worth paying attention to.
The Study Setup
Participants were split into three groups to write essays. One group used ChatGPT throughout the process, another used a search engine, and the third wrote with no digital help at all, relying only on their own minds. In a later session, the ChatGPT group edited their essays without AI, while the brains-only group used AI tools for editing. All participants were then asked to recall specific parts of their writing.
Key Findings
The group that depended on ChatGPT struggled to accurately recall details from their essays. According to the study, “Dependence on LLM tools appeared to have impaired long-term semantic retention and contextual memory, limiting [the] ability to reconstruct content without assistance.” Simply put, relying on AI led to shallow engagement with the material and weaker memory of what was written.
The researchers call this phenomenon cognitive debt: repeated reliance on AI replaces the mental effort needed for independent thinking. Instead of carefully selecting and analyzing information themselves, users offload that responsibility to the AI, which can result in more biased and superficial writing.
What Writers Should Take Away
- Memory and Understanding Matter: Using AI to generate or heavily assist with content might reduce your ability to retain and understand the material deeply.
- Critical Engagement is Key: Writing isn’t just about putting words on a page; it’s about thinking through ideas and deciding what to include.
- Balance is Crucial: Some students find AI helpful when used within their own knowledge boundaries, but unchecked use risks dependency.
Voices from Students
A student from Emily Carr shared, “I use LLMs a lot, every day. LLMs are most useful (and safe) when they’re used within the boundaries of your own expertise.”
Conversely, Saba Amrei, a fourth-year student at Capilano University, believes, “I don’t think ChatGPT or AI should be allowed in school or university. The purpose [of education] is to have students do the work and learn the process.”
Luke Hopkinson, a first-year student at the University of British Columbia, refuses to use AI for reasons that go beyond reliability: “AI is trained upon the history of English literature and internet content, meaning that it carries all of the inherent biases.”
Why This Matters for Writers
Whether you’re drafting blog posts, articles, or creative work, relying too much on AI risks losing ownership of your ideas. The study warns that blindly accepting AI-generated suggestions can lead to “internalizing shallow or biased perspectives.”
For writers, this means your unique voice and critical viewpoint might get diluted if you don’t actively engage with the content you produce. The short-term convenience of AI could come at the cost of long-term skill development.
While the study is preliminary and involved a limited number of participants, it raises important questions about how AI tools impact thinking and memory. Writers should weigh these findings carefully before leaning too heavily on AI assistance.
For those interested in learning how to use AI thoughtfully and effectively, consider exploring courses on ChatGPT and AI writing tools designed to help balance AI use with critical engagement.