"No longer just kids bullying kids": Jason Clare warns schools about AI-driven abuse
Australia's Federal Education Minister Jason Clare is sounding the alarm: AI is accelerating bullying in and beyond school. This isn't just playground taunting. It's face-swap deepfakes, anonymous chatbots, and abuse that follows students home - 24/7.
The government has announced a $10 million national plan to respond, including tighter timelines for schools and new tools for staff, students, and parents. For educators, the job is clear: act fast, document, and escalate.
What's new - and why it matters
- AI-boosted abuse: Students' faces pasted onto explicit images, then circulated at scale.
- Hostile chatbots: Reports of bots messaging students with harassment and self-harm prompts.
- Always-on exposure: Bullying now happens day and night, on social apps and private messaging.
What the national plan funds
- $5 million for a national awareness campaign.
- $5 million for new resources for teachers, students, and parents.
- 48-hour response expectation: Schools are expected to act on bullying complaints within two days.
Immediate actions for school leaders and teachers
- Implement a 48-hour workflow: Intake, triage, parent contact, documented action, and follow-up.
- Preserve evidence: Secure screenshots, URLs, timestamps, and platform details; log everything.
- Activate wellbeing support: Prioritise risk checks the same day. Escalate to school psychologists or counsellors.
- Engage parents early: Share what you know, what you're doing, and when you'll update next.
- Work with platforms: Report offending accounts, request deepfake takedowns, and push for urgent moderation.
- Teach digital self-defence: Blocking, reporting, privacy controls, and "don't forward harmful content."
- Clarify AI use: Update policy on AI tools, image generation, and chatbot access on school networks.
Handling AI chatbots and social platforms
The government's move to restrict social media access for under-16s will help, but schools still need their own guardrails. Block unapproved chatbots on school devices, review content filters, and monitor for image-based abuse. Teach students to treat unknown bots like strangers: don't engage, don't share personal information, and report immediately.
What to tell students
- Do not reply to abuse. Report it to a trusted adult and the platform.
- Capture evidence safely (screenshots, links) and hand it to staff.
- Do not share or "joke-forward" harmful content - it multiplies harm.
- Check in on peers who may be targeted and connect them with help.
Partner with parents
- Agree on home-school communication channels for quick updates.
- Share practical guides on privacy settings, reporting, and device rules.
- Encourage device-free sleep and charging devices outside bedrooms to reduce after-hours exposure.
Staff readiness checklist
- Run short training on identifying deepfakes and image-based abuse.
- Refresh acceptable use policies to cover AI tools and consequences.
- Drill the 48-hour response: who does what, and in what order.
- Nominate a single point of contact for complex or high-risk cases.
Useful resources
- eSafety Commissioner: Cyberbullying guidance and reporting
- Kids Helpline (24/7): support for young people and families
Upskilling your team on AI literacy
If your staff need fast, practical training on AI safety, policy, and classroom use, explore role-based options here: Complete AI Training - courses by job.
Bottom line
Bullying has changed. Response times and evidence handling now matter as much as pastoral care. With a 48-hour mandate, clearer AI policies, and stronger partnerships with parents and platforms, schools can reduce harm and protect learning time.