AI in Education: A Clear Plan From Vision to Safe Delivery
At the UK AI for Education Summit in London, the Education Secretary set out a simple idea: connect what has always worked in education - great teaching, curiosity, and collaboration - with practical uses of AI that raise outcomes for every learner.
The goal is bold but grounded. Use AI to make learning more personal, boost achievement for disadvantaged pupils and those with special educational needs and disabilities, and give teachers back time for the human work only they can do.
What Will Change - And What Won't
AI can personalise support at scale. Every student can get timely feedback. Every lesson can adapt to real needs.
But one thing stays the same: a teacher at the centre of every classroom. In recent research with over a thousand young people, students asked for exactly that - personal attention from teachers and social interaction with friends. AI should support those relationships, not replace them.
A Joint Effort Across Government and Industry
The education department will partner with national science and technology teams to move faster, bringing innovation into classrooms with care and evidence. The department has already worked with major companies like Google and Microsoft to establish clear expectations for safe use in schools, backed by resources to help teachers use AI well.
The ask from government is straightforward: build tools that improve learning, respect child development, and meet strong safety standards. No gimmicks. No features that distract from learning.
Updated Safety Standards for Schools
New measures focus on what is effective, safe, and child-centred. Four practical guardrails stand out:
- Mental health first: Systems must route pupils to human support where needed. Unregulated chat that misses or mishandles signs of self-harm is out of bounds.
- Protect thinking, not shortcuts: Tools should prompt genuine effort before giving help. Encourage learning; don't spoon-feed answers.
- Support social and emotional development: Especially for young pupils and those with SEND, AI must not displace real human interaction. Human-like mimicry that invites unhealthy trust or disclosure is restricted.
- No persuasive or exploitative design: No features that keep pupils on screens longer than required for learning. Education comes first, engagement mechanics second.
Keep Purpose Front and Centre
There's a warning worth hearing. Complexity can become its own trap - Dickens wrote about that just streets away in Bleak House, where the endless case of Jarndyce and Jarndyce let process swallow purpose. For AI in education, the mission is clear: lift learning, support teachers, and widen opportunity - especially for those who have the most to gain.
What This Means for Schools and Trusts
- Adopt with intent: Choose tools that improve learning, save teacher time, and meet the updated safety standards.
- Protect the human layer: Use AI to free teachers for explanation, coaching, and connection - the work that changes lives.
- Prioritise SEND and disadvantage: Focus AI pilots where the gains could be greatest. Measure impact on access, progress, and engagement.
- Build staff confidence: Provide clear guidance on prompts, feedback, assessment integrity, and safeguarding.
What This Means for EdTech Companies
- Design for learning, not stickiness: Avoid persuasive loops. Make "less screen, more learning" a feature.
- Make thinking visible: Encourage pupils to attempt answers before help. Show reasoning, not just outputs.
- Respect developmental science: Do not mimic human social cues in ways that blur boundaries with children.
- Ship safety by default: Human escalation paths, age-appropriate experiences, and clear data practices are non-negotiable.
How To Get Started This Term
- Run a focused pilot in one subject and year group with a clear success metric (e.g., quality of feedback, time saved, or progress on a specific skill).
- Use an approval checklist aligned to the safety standards above before any new tool enters the classroom.
- Set classroom norms: attempt first, then ask AI; cite AI use; the teacher's review is the final word.
- Include students and parents in review cycles. Publish what you're testing, why, and what you're learning.
Evidence, Not Hype
This is not about chasing trends. It's about repeatable gains in learning and teacher workload. Independent guidance is growing - for example, UNESCO's work on practical guardrails for generative AI in education is a useful reference point.
A Practical Next Step for Professional Development
If you are planning CPD on AI use in schools, curated courses by role can speed up adoption while keeping teams aligned on safe practice.
Bottom Line
AI can make education more personal, more humane, and more effective - if we keep teachers at the centre and safety at the core. The standards are rising. The bar for impact is clear.
The next move is ours: pilot, measure, improve, and share what works. Let's build tools and classrooms that help every child learn more, think deeper, and thrive.