ETH Zurich researcher calls for nuanced approach to AI in children's education

AI belongs in classrooms, but only after students master the underlying skills it can shortcut. The debate isn't whether to use it (it's already there) but when and how.

Categorized in: AI News, Education
Published on: Apr 09, 2026

AI in Schools Requires Balance, Not Ideology

Education researchers are pushing back against both fierce resistance to and uncritical enthusiasm for AI in classrooms. The real question isn't whether to use AI, which is already everywhere, but how to use it effectively without replacing essential cognitive work.

Martina Rau, a professor of learning and instruction at ETH Zurich, said the conversation about AI in education has become polarized. "Neither camp can say in simple terms whether AI helps or hinders learning," she said. Good teaching depends on pedagogy, not whether tools are digital.

Skills Students Must Master Alone

Certain foundational abilities require students to work without assistance. Writing is a primary example. The act of composing develops logical thinking and helps organize ideas. Only after students learn to write independently does it make sense to teach them how AI can improve their drafts.

This principle extends beyond writing. Students need to understand how software works before relying on it to analyze data. They should formulate explanations in their own words before using AI to check their answers.

AI as a Tool, Not a Shortcut

AI functions well as a cognitive tool: something that reduces mental strain without replacing thinking. Just as a knife left on a kitchen counter serves as a memory aid, AI can generate practice materials or extract difficult vocabulary from texts for flashcard drills.

The risk emerges when students use AI to avoid difficult cognitive tasks. Teachers should treat this the same way they would treat any other form of cheating: by addressing it directly and, if necessary, restricting AI use in assignments.

One documented problem is "the illusion of knowing." Students read AI-generated explanations and believe they understand a concept when they don't. True understanding requires students to explain ideas themselves, apply them practically, and transfer concepts to new problems.

When AI Gets It Wrong

AI hallucinations and errors are teaching opportunities. When AI produces incorrect information, classroom discussion about those mistakes reinforces an essential point: AI cannot be trusted blindly.

To improve AI feedback, teachers should direct it to specific sources, such as course materials or reputable websites, so it has reliable reference points. This practice makes AI more useful while teaching students to question its outputs.

Parents as Context Providers

Parents who explain how they use AI in their own work offer children valuable perspective. Adults who built knowledge before widespread AI can interpret and contextualize what AI produces in ways younger people cannot.

Schools should treat AI as a genuine classroom topic, not something to hide or pretend doesn't exist. This approach connects education to students' actual lives and the world they'll enter.


