Harvard's AI Education Debate: Gardner and Roberts on Teachers as Coaches
AI is becoming central to learning and assessment, and Gardner and Roberts urge rethinking curriculum and teacher roles: pilot tools, teach AI literacy, update assessment, and ensure equity.

Education in the Age of AI: Key Takeaways from the HGSE Forum
Educators packed Askwith Hall at the Harvard Graduate School of Education to hear Professor Howard Gardner and visiting Harvard Law professor Anthea Roberts discuss how artificial intelligence will change learning. The message was clear: AI isn't a side tool anymore; it's a core part of how we'll teach, learn, and assess.
Two perspectives you can use
Anthea Roberts offered a confident view of AI's value in learning. She called large language models "exceptional" and built an AI tool, DragonFly Thinking, to help people examine complex issues from multiple perspectives. It's being piloted by government agencies in Australia and nudges users to look through different lenses: stakeholders, risks and rewards, and domain experts.
Howard Gardner took a measured stance. He expects AI to change both K-12 and higher education, including what we teach and how we assess. He questioned the idea that students must march through every traditional discipline in the same way, suggesting that many knowledge tasks will be handled better by AI, freeing educators to prioritize higher-order thinking.
Curriculum will change
Expect a shift from survey-style coverage to problem framing, inquiry, and project work. Content isn't going away, but breadth for its own sake will give way to applied understanding and perspective-taking.
Plan for more comparative reasoning, synthesis across subjects, and frequent reflection on process: how students arrived at answers, not just the answers themselves.
The teacher's role moves toward coaching
With AI offering personalized support, teachers can focus on feedback, judgment, ethics, and context. The idea that everyone completes the same tasks in the same way will feel outdated.
Expect more differentiated paths, with teachers guiding goal-setting, curating resources, and evaluating quality, skills AI can't fully replace.
What to do now: practical steps for schools
- Draft clear policies on AI use: disclosure in assignments, acceptable support, academic integrity, and data privacy.
- Pilot AI in low-risk areas first: lesson planning, rubric-aligned feedback, and idea generation. Use opt-in cohorts and measure outcomes.
- Teach AI literacy: prompt strategy, source checking, bias awareness, and citing AI contributions.
- Adopt multi-lens analysis in projects: stakeholder mapping, risk/benefit trade-offs, and cross-disciplinary viewpoints (echoing Roberts' approach).
- Redesign assessments: more oral defenses, process logs, annotated drafts, and portfolios to make thinking visible.
- Invest in staff training and micro-credentials so educators stay current and confident.
Equity and guardrails
Address access, bias, and privacy early. Ensure students have equitable tools, set guidelines for responsible use, and audit outputs for bias. Align with district policies and document classroom practices to protect both students and teachers.
Bottom line
Roberts highlights AI's strength in expanding how we think; Gardner reminds us to rethink what we teach and why. That mix of bold experimentation and clear judgment should guide your next moves.
If you want to track ongoing work in this space, follow the Harvard Graduate School of Education and similar research hubs. Then pilot, measure, and iterate in your own classrooms.