Wake County student pushes for AI detection safeguards after false cheating accusation
A Green Hope High School freshman was accused of using generative AI to write an English assignment. She wasn't. Now Eleanor Canina is demanding the school district establish clear policies on how teachers can challenge students over suspected AI use - and how students can fight back.
Canina's petition, which has gathered 87 signatures, highlights a growing problem in schools: AI detection tools produce unreliable results, yet teachers use them anyway to flag suspected cheating. The accusation stressed her out. The uncertainty about how to respond frustrated her more.
"Being falsely accused of something like that is stressful, frustrating, and scary," Canina said in her petition. "It can lead to unfair punishment, damage a student's reputation, and make students feel like their hard work doesn't matter."
How the accusation unfolded
Canina's English class lacked consistent instruction after her original teacher left. A substitute teacher graded her work and ran it through three AI detection tools. The results: 62%, 75%, and 87% likelihood of AI generation or "significant AI assistance."
The teacher offered her an alternative assignment for full credit. Canina refused, saying she shouldn't have to do extra work to prove an assignment she had actually completed herself.
In emails, the teacher acknowledged he didn't know her writing style well enough to judge fairly. "Given the current situation regarding the course's lack of direct instructional contact, I am relying on the evidence available to me," he wrote.
Another teacher later regraded her assignment using version history - a feature that shows exactly what was written when. That teacher confirmed Canina had written the work herself.
The detection problem
AI detection tools are notoriously unreliable. North Carolina's Department of Public Instruction explicitly warns against using them as the sole basis for cheating accusations. The tools frequently produce false positives, flagging human writing as AI-generated, as well as false negatives that miss actual AI use.
Wake County Schools does not provide or require AI detection tools. Instead, the district tells teachers to use multiple measures: reviewing a student's writing process, checking work history, and relying on professional judgment.
Yet individual teachers still access tools like Copyleaks and ZeroGPT on their own, according to students at the school. Without district-wide policy, teachers operate under different rules.
What the district says
Wake County has been drafting an AI policy since last year but hasn't finalized it. The school board plans to revisit the process this spring.
Board Member Chris Heagarty said the district should roll out a policy in sections rather than wait for a comprehensive one. "We're seeing more and more problems where despite all of the potential we see with AI, we really need to adopt a comprehensive AI policy," he said.
District spokesperson Sara Clark said Wake County provides AI training to staff through WakeLearns and on professional development days. She said the district is "actively evaluating adopting policy and guidance related to artificial intelligence" while listening to feedback from students, families, and staff.
What Canina is asking for
Her petition calls for three things: clear guidelines on when and how AI detection tools are used, transparency about that process, and a formal appeals procedure for accused students.
Canina told the school board that students lack an easy way to respond to false accusations, and asked the board to establish protections for students in that position.
"This experience made me realize the growing prevalence of the issue," she said. "Students being accused of using AI on their work when they didn't needs to be treated as a real concern."
The board's chief of staff and chief of academic advancement were asked to meet with Canina after her public comments.
The broader challenge
Teachers face genuine pressure to catch AI misuse. Some students do use generative AI to complete assignments they shouldn't. But detection tools can't reliably tell the difference between a student's own writing and AI-generated text.
North Carolina's guidance frames this as a teachable moment, not a gotcha. "If there is suspicion that a student depended on AI too heavily for an assignment, this should be viewed as a teachable moment to reinforce the appropriate partnership with AI tools rather than a 'gotcha' moment."
That requires teachers to understand how AI works and how to evaluate student work fairly. For educators seeking to build that knowledge, resources like the AI Learning Path for Teachers can help clarify how to work with AI in the classroom without relying on flawed detection tools.
Without clear district policy, teachers and students are left guessing at the rules.