Northeast B.C. school district adopts clear AI guardrails for classrooms
School District 60 (Fort St. John, B.C.) has approved guidelines for using artificial intelligence in schools. Students can use approved tools under a teacher's direction, within the existing code of conduct, and without sharing personal information. The focus is simple: protect privacy and keep learning honest.
Why now
"AI is here and we just want to be proactive about it," said superintendent Stephen Petrucci. The district developed the guidelines over the past year with support from the province and other districts such as West Vancouver and Surrey. B.C. released guidance in 2024 to help schools set policies for safe and responsible use of AI.
For provincial context, see B.C.'s K-12 guidance on generative AI from the Ministry of Education and Child Care.
What the guidelines say
- Use only district-approved tools (Google Gemini, Microsoft Copilot) and only under teacher direction.
- Do not include personal information in prompts or uploads.
- Check AI outputs for mistakes and bias before using them.
- Acknowledge when AI helped with an assignment or project.
- Respect Indigenous knowledge: oral traditions and cultural context may be misrepresented by AI.
- Use AI to support inclusion and accessibility in the classroom.
Petrucci described the guidelines as "guardrails" meant to move beyond fear of cheating and into constructive use. "It's really safety first… and then looking at the educational uses of it."
Rollout and training
The district-approved tools (Google Gemini and Microsoft Copilot) are already available to administrators and teacher leaders. The plan is to roll them out to all staff, then to students, over the next year once training and supports are in place.
"When they use it, it's under the umbrella of the district's security screens," Petrucci said. Board chair Helen Gilbert added that best practices for training and learning are in development: "We have early adopters and we have some that are going to end up pulled along." Parents, she said, want clear rules that keep kids safe and prevent sharing anything that could be traced back to them.
What families are saying
Thomas Whitton, an outgoing trustee and parent of four, noted that AI is already part of daily life: "When they say, 'Hey, Google,' or, 'Hey, Alexa,' they're talking to artificial intelligence." His family doesn't use AI to do homework, but he supports teaching students how to use it well, without oversharing or adding to their "digital tattoo."
"Hopefully we can give them the guidance… to use it properly - to not put personal information out… to make sure you're putting information out there that is healthy and grows your knowledge base."
Practical steps for school leaders and teachers
- Post a one-page classroom AI policy: allowed uses, crediting requirements, and privacy rules.
- Teach prompt hygiene: remove names, locations, student IDs, and any identifiable details.
- Require a short "AI assistance" note on assignments that used AI (what tool, for what task).
- Build a routine for accuracy checks: verify facts, cite sources, and flag potential bias.
- Consult local Indigenous educators to set clear boundaries on cultural topics and context.
- Schedule professional development on Gemini and Copilot features, data protections, and classroom workflows.
- Create sample tasks that use AI for brainstorming, outlines, accessibility, and language support, without replacing original student thinking.
Helpful resources
- Microsoft Copilot for Education overview (data protections, scenarios): learn.microsoft.com
- B.C. guidance on educational technologies and AI: gov.bc.ca
If your team needs structured training to get staff ready, explore curated AI courses organized by job role.
Bottom line: clear rules, teacher-led use, and a safety-first mindset give students the benefits of AI without risking privacy or academic integrity. Create clarity now, and the classroom gets easier to manage later.