AI In NHPS Classrooms: Embrace, Resist, Or Redesign?
"AI is here, and we have to embrace it." That's the stance from NHPS Assistant Superintendent of Curriculum, Instruction, and Assessment Keisha Redd-Hannans. In the same breath, leaders warn it may be "the most dangerous tool" schools have seen. That tension is exactly where teachers, students, and administrators are working right now.
What's Actually Happening In Classrooms
Social studies teacher Ryan Boroski now runs every essay through multiple AI detectors (Scribbr, ZeroGPT, GPTZero). He knows they're imperfect, so he looks for strong agreement across tools before he talks with a student. In a recent assignment on Christopher Columbus, two of nineteen essays were fully AI-generated.
Students admit frequent use. Some say weekly, a few daily, and one called it "hourly." They use it to start drafts, check math with Photomath or Gauth.ai, and write cleaner emails. One described it as talking to "a really smart friend."
Teacher estimates vary, but many report 10-30% of submitted work shows signs of AI. The pull is obvious: instant help, less friction, and a quick path to a grade.
What The District Is Doing
Connecticut launched a semester-long AI pilot in seven districts in January 2025. New Haven then passed an AI policy in August 2025 that frames AI as a support for learning, not a replacement for teachers. It points to practical use cases: lesson planning, feedback, study guides, comprehension supports, and early drafting ideas.
NHPS is piloting two tools across eight schools. Clinton, East Rock, Engineering and Science University Magnet School, and Hillhouse will try SchoolAI. Hill Central, John C. Daniels, Wilbur Cross, and Co-op will try Magic School AI. The goal: see what actually helps teachers and students without creating new problems.
Middle school ELA is also testing Writable through HMH Into Literature. Writable sends AI feedback to the teacher first, putting the adult in the loop. Early signs were promising: quicker feedback on grammar and structure, and a small bump in literacy coinciding with the new curriculum rollout.
Professional development is ramping up. Google led sessions on Gemini, NotebookLM, building reusable "Gems," and better prompting. Some board members want AI use to eventually be an expectation for staff. Redd-Hannans sees it as a skill students must have by the time they graduate.
Adoption Is Uneven (By Design)
Teacher approaches range widely. Some keep their proven methods and avoid new tools. Others are building AI into research, reading support, and lesson planning. Science teachers point students to search tools like Perplexity. World language teams use AI to differentiate texts. A principal even used Gemini to develop a school tagline.
English teacher Mercedes McKelvie models process-based use: outlines, rubric-aligned edits, idea organization, and creative tools like TextFX for figurative language. Her classes will write about responsible vs. irresponsible AI use, then practice clear disclosure. Her bet: teach students how to use AI well instead of pretending they won't touch it.
Student Behavior, Integrity, And Fairness
Some students admitted letting AI think for them in earlier grades, then pulled back after realizing the long-term cost to confidence. Others use AI as an editor and disclose how they used it. A student representative on the AI Policy Committee has already fielded complaints about false positives from detection tools.
Why do students lean on AI? Overwhelm, unclear tasks, weak relevance, too many responsibilities, and grade pressure. One Wilbur Cross teacher, Dario Sulzman, worries student writing starts to sound "soulless" the more they outsource it. He argues for protecting students' voice and for keeping education's purpose bigger than job training.
The Equity Question
New Haven is still digging out from Covid's impact. ELA proficiency is well below pre-pandemic levels. The district faces underfunding, high free-and-reduced lunch rates, and persistent vacancies. A recent study ranked Connecticut near the bottom on educational equity.
Redd-Hannans believes AI can level the field if used wisely. The risk is obvious, too: students with fewer supports could become more dependent and learn less. The outcome depends on the guardrails adults set and the quality of daily teaching.
What Educators Can Do This Semester
- Set a clear "AI use" policy by course. Spell out what's allowed (brainstorming, outlining, editing) vs. what's not (full drafts, math solutions without work). Require a brief "AI use note" on submissions.
- Grade the process, not just the product. Collect planning docs, outlines, drafts with revision history, and short oral defenses. Make it easier to show authentic thinking.
- Redesign a few key assessments. More in-class writing, cold writes, and conferences. Use local data sets, primary sources, and personal experiences that generic AI can't fake well.
- Teach AI skills explicitly. Show students how to prompt for clarity, ask for sources, and check claims. Model how to move from an AI scaffold to their own language and ideas.
- Use detectors as triage, not verdict. They're fallible. If flagged, confer with the student, review process evidence, and look for voice consistency across prior work.
- Differentiate with AI on your terms. Level texts, generate practice items, translate directions, and create sentence starters. Keep teacher judgment in the loop.
- Tighten privacy and tool approval. Stick to district-approved tools. Check vendor data practices, retention, and student age requirements. Avoid uploading sensitive student work.
- Offer real supports to reduce misuse. Clearer prompts, staged deadlines, exemplars, and time to revise. Students cheat less when the path to success is visible and supported.
- Share what works. Host short, team-led PD. Swap prompts, rubrics, and mini-lessons that saved you time or lifted student work.
Two Classrooms, Same Goal
In one room, Boroski runs a debate on MLK vs. Malcolm X using primary sources. No AI allowed. In another, McKelvie coaches students on using AI to edit with integrity and to think more clearly.
Different methods, same intent: help students think for themselves. AI can reduce friction and boost access, but it won't replace the trust and expectations that move learning. As one teacher put it, the work still comes down to human connection, and to helping students build a voice that sounds like them.