AI in classrooms: bend the arc toward better learning
A new study from the Center for Universal Education at the Brookings Institution says the current path of generative AI in education poses more risk than benefit to students. The key message: it's not too late to change course and use AI to enrich, not diminish, learning.
The report, "A new direction for students in an AI world: Prosper, Prepare, Protect," offers a practical framework for schools, systems, governments, companies, and families. It's a call to act now, not in a decade, with clear steps any education leader can start this term.
Source: Brookings Center for Universal Education
What the study looked at
The team ran a year-long global study: 500+ students, teachers, parents, leaders, and technologists across 50 countries, plus a review of 400+ studies. Their conclusion is blunt: on the current trajectory, risks overshadow benefits for children's learning and development.
The issue isn't that AI can't help. It's that the risks hit the foundations of learning: thinking, motivation, relationships, and safety. That makes the promised gains harder to realize if we don't change how we deploy these tools.
Two paths for AI in learning
- AI-enriched learning: When used within sound pedagogy, AI can give feedback, differentiate practice, support language learning, and free teachers to focus on high-value instruction.
- AI-diminished learning: Overreliance can erode critical thinking, intrinsic motivation, and trust. It can also expose students to privacy and safety risks if oversight is weak.
Prosper, Prepare, Protect: a framework for action
Prosper: Use AI to improve learning-on purpose
- Adopt AI only when it clearly improves learning outcomes or teacher capacity. Set success criteria in advance.
- Prioritize tools that support formative feedback, scaffolding, and practice, not shortcuts to finished work.
- Pilot in small cohorts, gather evidence, and scale what works. Sunset what doesn't.
Prepare: Build AI literacy across your community
- Train staff and students on prompts, verification, bias, and academic integrity. Treat AI use like a literacy, not a trick.
- Update curricula to emphasize reasoning, problem-solving, media literacy, and productive struggle.
- Write clear classroom and district AI use policies. Explain the why, the how, and the guardrails.
Protect: Safeguard cognition, relationships, and data
- Limit cognitive offloading. Require students to show reasoning, drafts, and reflections, not just final outputs.
- Choose vendors that meet strict data privacy standards, minimize data collection, and allow auditability.
- Keep the teacher-student relationship central. Use AI to augment, not replace, human feedback and connection.
Why blocking AI won't work
As one security expert noted, AI use is spreading fast across campuses. The biggest risk is cognitive offloading: outsourcing thinking to tools and losing the very skills schools exist to build. Blocking tools often drives workarounds and shadow use.
The better path: teach responsible use, set clear accountability, and implement AI usage controls where feasible for visibility and oversight. Schools will need a mix of policy, pedagogy, and technology, not a ban.
Practical steps you can start this term
- District leaders: Stand up a cross-functional AI task force (instruction, IT, legal, student support). Publish a one-page AI policy with use cases, red lines, and procurement criteria.
- School leaders: Run a 6-8 week AI pilot in two subjects. Measure student outcomes, teacher time saved, and integrity incidents. Share findings transparently.
- Teachers: Redesign one assignment per unit to require process evidence: notes, drafts, think-alouds, and reflections. Make AI use transparent and declared: "If you used it, show how."
- EdTech teams: Vet tools for data minimization, opt-out controls, model transparency, and logging. Prefer local or education-specific models when possible.
- Families and students: Agree on "AI rules of use" at home. Encourage learning-first prompts (explain, hint, question) over answer-first prompts.
- Policy makers: Set baseline privacy and safety standards for K-12 AI tools. Fund educator training and independent impact evaluations.
A three-year commitment
The report urges every stakeholder to pick at least one recommendation to advance within three years. Small, evidence-based changes, multiplied across classrooms and systems, can shift outcomes fast.
AI will be part of school. The question is whether it builds stronger thinkers or shortcuts the process. That choice is ours.
Resources
- Brookings Center for Universal Education - research, briefs, and the full framework.
- Complete AI Training: Courses by Job - options to upskill educators on practical, responsible AI use.