Support, not replace: how a Dorset primary teaches AI as a fallible assistant, and what that means for governance

Teach AI like a lab partner: helpful, but fallible. Do that and you cut busywork, deepen thinking, and set fair rules that respect teachers, students, and creators.

Categorized in: AI News, Education
Published on: Oct 25, 2025

AI skills: what a primary school can teach us about AI governance

A south coast primary school offers a simple lesson: teach AI as a fallible assistant, not a magic wand. That single shift reshapes student learning, staff workload, and AI governance.

It also exposes a hard truth. AI used uncritically becomes shadow IT, drains trust, and fails at scale. Used deliberately, it returns time, deepens thinking, and raises standards.

Teach AI like a lab partner, not a cheat code

At Powerstock Primary in Dorset, headteacher Nick Harris set a clear expectation: students must challenge the machine. Year 3 and 4 pupils tested AI's reading and reasoning, and celebrated when it got their school's pupil count wrong because the training data was out of date.

That moment matters. Kids learned that AI can be helpful and still be wrong. Curiosity went up. Blind trust went down.

Support teachers, don't replace them

The school's strategy also targets staff workload. Lesson prep time dropped. Teams used that space for shared planning and creativity instead of admin.

They rebuilt their curriculum with AI in the loop and teachers in control. As one staff member put it, "For the first time in years, I had the space to think." That's the point: give teachers time to teach well, not to chase tasks.

AI is a system change, not a single tool

Les Hopper, Product Director at Pearson, puts it plainly: AI isn't one tool. It's a set of disruptions that stretch across content, workflows, platforms, and assessments. Education works the same way: interconnected and sensitive to ripple effects.

Introduce AI in teaching without considering application and assessment, and you create gaps elsewhere. Change one part and you change the whole.

Consent, credit, and the cost of "free" data

There's a bigger issue behind the classroom: the data that fuels AI. Many systems are trained on scraped human work. If that continues without informed consent, credit, or fair payment, we're building learning on an unfair foundation.

If education is going to model digital citizenship, it has to address attribution, licensing, and ethical use. Otherwise, we teach students to benefit from work that creators were never asked to contribute.

AI can speed tasks and still thin out thinking

Hopper's warning is useful: AI can help you skip the hard parts of learning, namely critical thinking, meaning-making, and retention. It can generate a summary without understanding, an answer without reasoning.

Student adoption is high, but depth of use isn't equal. Wealth and gender gaps exist. If we ignore that, AI becomes another divider.

A practical playbook for schools

  • Set a "fallible assistant" policy: AI can draft, suggest, and critique. It cannot be the final author, the final answer, or the moral authority.
  • Teach AI literacy early: Fact-checking, source requests, cross-referencing, and spotting hallucinations are core skills.
  • Redesign assessment: Use oral defenses, process journals, and source annotations. Require students to mark AI-assisted sections and explain why they used it.
  • Make thinking visible: Ask for step-by-step reasoning, not just outputs. Reward method over speed.
  • Protect time for staff: Co-plan units with AI support, then review together. Share prompts that work. Standardize what "good" looks like.
  • Set privacy rules: No personal data, sensitive student info, or copyrighted materials in prompts. Use age-appropriate, safeguarded tools.
  • Close equity gaps: Provide devices where possible, create quiet workspaces, and offer offline alternatives for AI-dependent tasks.
  • Model copyright ethics: Teach licensing, citation, and data consent. Discuss how models are trained and why attribution matters.
  • Track impact: Measure time saved, quality of student reasoning, and incidents of misuse. Iterate based on evidence, not hype.
  • Invest in staff development: Short, frequent training beats one-off workshops. Build a community of practice across subjects.

Classroom guardrails that work

  • Always verify claims with two trusted sources.
  • Ask AI for sources and check them. No sources, no submission.
  • Use AI for ideation, outlines, and critique, never as the final draft.
  • Label AI use clearly. Explain what was used and why.
  • Don't upload personal information or third-party work you don't own.
  • Prefer explainable steps over flashy outputs.

Leadership questions to ask this term

  • What tasks will AI support for staff, and what stays strictly human?
  • How will students learn to evaluate AI outputs, and show their reasoning?
  • Where are the biggest privacy risks in our current tools?
  • How will we address access gaps between students?
  • What's our policy for AI citations and disclosures?
  • How do we handle copyright, licensing, and fair use in prompts and outputs?
  • What metrics will tell us if AI is improving learning, not just saving time?
  • Who owns and audits our AI prompts, outputs, and workflows?

A note on Bloom's taxonomy, and why it matters now

AI challenges the old order of remember, understand, apply, analyze, evaluate, create. If a tool can draft and "evaluate," students may skip the foundational steps that build real understanding. That's on us to fix with assessment and instruction.

For a refresher on the levels and verbs you can use in tasks, this guide is clear: Bloom's Taxonomy (Vanderbilt University).

Policy winds are shifting; schools can't drift with them

Some leaders now prioritize AI's opportunities over safety. That's fine for press releases, not for classrooms. Schools need a stance rooted in ethics, evidence, and child development, not vendor marketing.

Look to independent guidance for a steady baseline: UNESCO's recommendations on AI in education.

The hard part we can't ignore

In underfunded schools, AI risks becoming a substitute teacher for students with devices, while essential skills go untaught. That's a short-term fix that creates long-term gaps.

The Dorset example shows a better way: teach students to challenge AI, give teachers time back, and build policy that respects people and the work they create.

Bottom line

Treat AI as support, not a shortcut. Teach it critically. Respect data and creators. Think in systems, not tools. Do that, and you'll raise outcomes without losing the human parts of education that matter.
