Investigation underway into AI-generated sexualized images at Calgary junior high: What educators need to know
Administrators at Twelve Mile Coulee School in Calgary have alerted families to reports that some students used artificial intelligence tools to create sexualized images of peers and shared them on social media. The Calgary Police Service's School Resource Officer (SRO) team is involved, and police confirm an active investigation with no charges laid at this time.
School officials emphasized their responsibility to maintain a welcoming, caring, respectful, and safe environment. They indicated that any threats or harmful behavior connected to the school community will be investigated and may lead to school-level discipline and police action.
Key points from the school's response
- AI-generated sexualized images were created and shared by some students outside of school hours.
- Administration is working with the CPS School Resource Officer team; the investigation is ongoing.
- Families were urged to review students' apps, online presence, and device security settings.
- Officials stressed that inappropriate language, images, or behavior directed at students or staff is not a harmless prank.
Context for educators
Low-barrier image tools make it easy for teens to manipulate photos and spread them quickly. This incident follows a December case in which a Calgary teen was charged after allegedly using AI to sexualize images of girls from multiple high schools.
For schools, this is a student safety issue, a policy issue, and a staff readiness issue all at once. Responses need to be clear, quick, and coordinated across administration, counseling, IT, and family communication.
Immediate actions schools can take
- Coordinate with your SRO or local police liaison to understand reporting thresholds and evidence preservation.
- Activate your student safety and digital conduct protocols; document timelines, decisions, and contacts.
- Preserve evidence: instruct involved students and families not to delete messages or images; capture screenshots with timestamps where appropriate.
- Provide victim-centered support: counseling, discreet schedule adjustments, and safe-reporting channels (anonymous if available).
- Send a concise family update with plain-language guidance on device checks, privacy settings, and reporting steps.
- Re-teach digital citizenship: consent, image manipulation, bystander responsibilities, and the real consequences of "jokes."
- Review supervision touchpoints: locker rooms, clubs, field trips, and any spaces where image capture commonly happens.
- Align with your board's code of conduct and escalate consequences consistently; keep records for potential police follow-up.
- Audit staff readiness: who knows how to collect digital evidence, contact platforms, and support impacted students?
Guidance you can share with parents
- Check phones and social accounts regularly; review app permissions and privacy settings.
- Talk with your child about consent and digital footprints; ask what they're seeing in group chats and how they respond.
- Encourage reporting to a trusted adult if they're targeted or witness harm; do not forward harmful images.
- If an image is circulating, preserve evidence and report promptly to the school and police.
Reporting and removal support
- Report child sexual exploitation or non-consensual intimate images to Cybertip.ca for triage and takedown support.
- Find prevention resources and guidance through the Canadian Centre for Child Protection.
Policy, training, and prevention
- Update policies to address AI-specific misconduct: deepfakes, altered media, and distribution. Define offenses, procedures, and consequences.
- Build staff capacity on AI basics, digital evidence handling, and trauma-informed response.
- Embed short, recurring lessons on consent, misinformation, and media manipulation into advisory or health classes.
- Establish a simple, private reporting path (QR code form or email alias) that students actually use.
- Run a parent session covering device safety, social platforms, and school processes for handling incidents.
Professional learning
If your team needs a fast primer on AI capabilities, risks, and classroom guardrails, consider structured training to build shared language and expectations. Explore role-based options here: AI courses by job.
Bottom line for educators: act quickly, protect students, and make the process visible. Clear protocols, steady communication, and consistent follow-through deter copycats and rebuild trust after incidents like this.