Teachers use AI for lesson planning but question its value for student learning

Teachers have found AI useful for drafting lesson plans and newsletters, but most still can't answer a basic question: what does it actually do for students in the classroom? Three years in, cautious and selective beats wholesale adoption.

Published on: Mar 28, 2026

Teachers Ask a Practical Question About AI That Schools Still Can't Answer

A fourth-grade math teacher posed a straightforward question: "What can I actually use this for in math?"

That question captures where classroom AI adoption stands three years into the generative AI era. Schools have faced pressure to respond to tools like ChatGPT with limited guidance and considerable hype. Some administrators framed the technology as transformative for teaching. Others warned of disaster. Yet in many classrooms, adoption has moved slower and more selectively than the surrounding noise suggested.

Teachers are often characterized as resistant to innovation when they hesitate. But educators behave like professionals in most fields when encountering new technology: they ask whether it solves a real problem. A research project involving 17 teachers from third through 12th grade found that teachers are not rejecting AI. They are simply maintaining boundaries around where it belongs in their work.

Productivity Gains, But Not in the Classroom

Teachers have found immediate value in generative AI for administrative tasks. An engineering teacher in New Jersey said he uses AI routinely to draft lesson plans that schools require but few read. A colleague described peers using it for newsletters and planning documents.

This pattern mirrors how professionals across industries adopted the technology. Generative AI excels at drafting, summarizing, and generating text, precisely the tasks that pile up for educators juggling grading, parent communication, and reporting alongside instruction.

But administrative efficiency differs from classroom instruction. When teachers consider introducing AI to students during lessons, they calculate differently. The question shifts: What learning problem does this tool solve?

The Instructional Gap Remains Unsolved

Many educators still cannot answer that question after years of exposure to generative AI. Some experiment cautiously. A science teacher from Guam has students write first drafts, then feed them into ChatGPT for revision, but discourages AI for research.

Others teach AI itself as the subject. A high school special education teacher in New York deliberately trained a chatbot on bad data so students could see that a system's output is only as good as the data it is trained on. Learning science research supports this approach: students benefit most when technology supports reflection and revision, not when it replaces the cognitive work of thinking and problem-solving.

Teachers in the study did not treat AI as a source of authoritative knowledge. They treated it as something to analyze and critique.

AI Literacy as a Classroom Entry Point

The clearest instructional opportunity many teachers see is AI literacy. International guidance from UNESCO and the OECD increasingly frames understanding algorithmic systems as a foundational skill. Students already navigate environments shaped by algorithmic feeds and recommendation engines.

An elementary teacher in New York focuses on teaching students how to prompt AI systems and fact-check their outputs for bias. A middle school teacher uses a peanut butter and jelly sandwich exercise to explain machine learning: ingredients become datasets, procedure becomes algorithm, output depends on design.

These lessons treat AI as a window into how digital systems generate knowledge, not as a productivity shortcut.
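The sandwich analogy can be made concrete in a few lines of code. The toy sketch below is not from the teacher's lesson; it is an illustrative assumption that uses simple word counting as a stand-in for a learning algorithm, to show that the same "procedure" produces different outputs when you swap the "ingredients" (training data).

```python
# Toy illustration of the sandwich analogy: the "ingredients" (training
# data) and the "procedure" (here, a word-counting algorithm) together
# determine the output. Change the ingredients and the prediction changes.
from collections import Counter

def train(dataset):
    """The 'procedure': count word frequencies across all sentences."""
    return Counter(word for sentence in dataset for word in sentence.split())

def predict(model):
    """The 'output': the single most common word the model has seen."""
    return model.most_common(1)[0][0]

# Two different ingredient lists (datasets), same procedure.
recipe_a = ["peanut butter first", "peanut butter always"]
recipe_b = ["jelly first", "jelly always", "jelly on top"]

print(predict(train(recipe_a)))  # -> peanut
print(predict(train(recipe_b)))  # -> jelly
```

Running the same `train` function on two different datasets yields two different predictions, which is the point of the exercise: the design of the data, not just the algorithm, shapes what the system says.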

Hallucinations and Bias Create Trust Problems

Teachers consistently raised concerns about reliability. An elementary librarian noted that ChatGPT sometimes fabricates information outright. A high school French teacher said the tool works only if you already know your subject well enough to catch its errors.

Others connected these risks to algorithmic bias. A computer science teacher at a school serving large populations of African American and Latino students pointed to hiring data, incarceration data, and facial recognition systems where error rates vary by demographic group. In these contexts, AI becomes a case study in how technological systems shape information rather than a tool for finding answers.

Pragmatic Indifference, Not Resistance

Teachers are adopting what might be called pragmatic indifference. One said: "I use it for lesson planning… but I don't really use the lessons." Another said: "I push students not to use it for research."

Teachers use AI where it clearly saves time while maintaining boundaries around core learning tasks. This reflects professional judgment about what students need to practice: deep reading, careful writing, reasoning through problems, evaluating evidence.

If a tool primarily reduces the need to perform that work, teachers have reason to question whether it advances learning or undermines it.

Schools exist partly to create conditions for complex cognitive work. The fourth-grade teacher's question, "What can I actually use this for in math?", remains the right one to ask. Until educators and schools can answer it clearly, pragmatic indifference is the professional response.

For educators exploring how to address this gap, AI learning paths for teachers offer practical frameworks for understanding where these tools fit in instruction. Schools looking at broader implementation can reference resources on AI for education to ground decisions in evidence rather than hype.

