AI in Classrooms: Brookings Finds Lost Learning, Frayed Trust, and Ways to Fix It

A global survey urges schools to slow down on AI: for K-12, the risks outweigh the gains. Focus on thinking, relationships, and transparency while setting real guardrails.

Categorized in: AI News, Education
Published on: Jan 21, 2026

Four Takeaways From a New Report on AI's Risks in Education

AI is spreading across every industry, but a new Brookings analysis of students, parents, and teachers in 50 countries says schools should hit pause. The headline: the risks for K-12 learners outweigh the benefits right now. Productivity gains don't translate into growth for kids whose minds are still developing, and the downsides compound fast.

Researchers also note a hard truth: we lack rigorous, long-term evidence on student learning and well-being. No one - including AI's creators - can forecast the full range of outcomes with certainty. That's exactly why educators need clear principles and practical guardrails.

1) Foundational learning is getting hollowed out

Adults with fully developed metacognition can use AI as a "cognitive partner." Most students can't yet. For many kids, AI becomes a surrogate - a shortcut that replaces struggle, practice, and depth with quick answers.

The result is cognitive offloading: less critical thinking, less sustained engagement, and declining skills. If students can submit AI-generated work and get credit, their incentive to learn drops - and so does their capacity to explain, argue, and create on their own.

2) Social and emotional growth is at risk

Learning happens in relationships. Yet students are leaning on chatbots for homework help, emotional support, and even companionship. Many teachers see warning signs, even if few students report harm outright.

Here's the concern: kids may not recognize emotional dependence or how it affects well-being. Over time, heavy use of AI companions can blunt core human skills - reading cues, resolving conflict, and building resilience after setbacks.

3) Trust between students and teachers is eroding

Teachers doubt the authenticity of student work. Students doubt teachers who quietly use AI to create lessons and assignments. That mutual suspicion fractures classroom culture.

Researchers found a measurable dip in trust, with a notable share of teachers calling it a serious issue. If this continues, it won't just strain classrooms - it could weaken trust in schools as institutions.

4) The damage is real - and fixable

The report is clear: we don't have to accept these harms. With deliberate design and transparency, schools can protect learning while using AI where it actually helps.

What to do next (practical steps you can deploy)

  • Redesign assignments away from transactional tasks. Prioritize reasoning, process evidence (notes, drafts, thinking steps), oral defenses, in-class creation, and frequent low-stakes checks.
  • Adopt "teach, not tell" AI. Prefer tools that question, scaffold, and explain over tools that spit out final answers. For example, favor a tool that responds to "Explain this paragraph in a different way" with vetted sources and citations rather than a finished answer to copy.
  • Co-create with your community. Form student AI councils, involve parents, and pilot tools with clear feedback loops before wide adoption.
  • Set shared norms for transparency. Students disclose AI assistance used and how it helped; teachers disclose where AI supported planning. Make trust the default.
  • Protect SEL. Set boundaries on AI companions; route sensitive support to humans first. Build in more peer-to-peer work, mentorship, and real conversation.
  • Address equity head-on. Free tools are often less accurate. Either budget for reliable access or provide structured alternatives so accuracy isn't paywalled.
  • Teach AI literacy. Cover capabilities, limits, bias, privacy, and evaluation of outputs. Invest in sustained PD so teachers can model smart use and critical thinking.
  • Collect evidence locally. Run small experiments, track learning and well-being, and share results across departments and schools.

Useful references

For context on the research and global recommendations, see the work from the Brookings Center for Universal Education.

For educator training at scale, the American Federation of Teachers launched the National Academy for AI Instruction, with a plan to train 400,000 educators over five years. Learn more via the AFT.

If you're building AI literacy and PD

If you're looking to curate coursework for your staff or plan role-specific upskilling, explore vetted options here: AI courses by job.

The bottom line

AI can expand access to content, but it can also widen gaps because the most accurate tools often sit behind paywalls. More importantly, easy answers can trade short-term productivity for long-term growth.

Design for thinking, protect human connection, and be transparent. Do that consistently, and you can gain the benefits without sacrificing the very skills school exists to build.

