Use AI to teach better, but keep humans in charge
At the 103rd annual conference of the Association of Heads of Anglo-Indian Schools in Kolkata, leaders pressed a clear message: bring AI into the classroom to make learning more engaging, but don't hand over the steering wheel. One session focused on AI-enabled curriculum design; the next underscored the rising risk of AI-driven cyberattacks, and the need for tighter policies and training.
AI in the classroom: helpful, with limits
Speaking at La Martiniere School, IIT-Kharagpur alumnus and AI expert Rajiv Agarwal said, "AI is our friend, but sometimes it gives wrong answers very confidently. We should take information from AI but not depend on it completely." His talk showed how AI can personalize learning paths, auto-check routine homework, and use interactive content to keep students engaged.
He likened it to making study time feel like "tasty, spicy food." When teachers mix humor, clear goals, and the right tools, marks improve because students want to show up and try. The warning: "Too much dependence on AI means we give away our creativity, critical thinking, and ability to judge right from wrong… Humans must always stay in control."
Where AI fits, and where teachers should step in
- Personalized practice: Use AI for differentiated exercises, spaced repetition, and hints, then review tough concepts live (a scheduling sketch follows this list).
- Feedback at scale: Let AI draft formative feedback on objective tasks; teachers refine and add context.
- Interactive content: Quizzes, simulations, and visual explanations can boost attention; pair them with short class reflections.
- Planning support: AI can suggest lesson structures or question banks; teachers align them to curriculum goals and student needs.
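To make the "personalized practice" item concrete, here is a minimal sketch of a Leitner-style spaced-repetition scheduler in Python. It is illustrative only: the Card class, the box intervals, and the sample prompts are assumptions rather than the workings of any particular classroom tool, and the teacher still decides how to reteach what students keep missing.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Review intervals (in days) for each Leitner box: items a student keeps
# answering correctly move to later boxes and are reviewed less often.
INTERVALS = [1, 2, 4, 7, 14]

@dataclass
class Card:
    prompt: str
    answer: str
    box: int = 0                                    # current Leitner box (0 = newest)
    next_review: date = field(default_factory=date.today)

def record_result(card: Card, correct: bool, today: date | None = None) -> None:
    """Move the card up a box on a correct answer, back to box 0 on a miss,
    and schedule the next review accordingly."""
    today = today or date.today()
    card.box = min(card.box + 1, len(INTERVALS) - 1) if correct else 0
    card.next_review = today + timedelta(days=INTERVALS[card.box])

def due_cards(cards: list[Card], today: date | None = None) -> list[Card]:
    """Return the cards a student should practise today."""
    today = today or date.today()
    return [c for c in cards if c.next_review <= today]

if __name__ == "__main__":
    deck = [Card("7 x 8", "56"), Card("Capital of Assam", "Dispur")]
    for card in due_cards(deck):
        record_result(card, correct=True)
        print(card.prompt, "-> next review on", card.next_review)
```

The value is in the workflow, not the code: the scheduler surfaces what each student should practise, and the teacher reviews the misses in class.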
Guardrails to prevent overdependence
- Written AI policy: Define allowed tools, age-appropriate use, citation norms, and what requires teacher approval.
- Human-in-the-loop: Require teacher review for AI-generated feedback, grades, or content that informs placement decisions.
- Assessment integrity: Use oral defenses, drafts with revision notes, in-class writing, and project artifacts rather than relying on AI-detection tools.
- Critical reading: Teach students to fact-check AI outputs, verify sources, and explain "why" not just "what."
- Data minimization: Don't upload identifiable student data to public tools; prefer vetted, school-managed platforms (a redaction sketch follows this list).
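As a small illustration of the data-minimization point, the sketch below assumes a school wants to strip obvious identifiers from text before it is pasted into any external AI tool. The patterns (emails, 10-digit mobile numbers, an "ADM-12345" style admission number) are placeholders; a real policy should define exactly what counts as identifiable data.

```python
import re

# Illustrative patterns only: emails, 10-digit mobile numbers, and admission
# numbers of the form "ADM-12345". A real policy should list exactly which
# identifiers must never leave school-managed systems.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b[6-9]\d{9}\b"),
    "roll":  re.compile(r"\bADM-\d{4,6}\b", re.IGNORECASE),
}

def redact(text: str) -> str:
    """Replace identifiable details with neutral placeholders before the
    text is shared with any tool outside school-managed systems."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label} removed]", text)
    return text

if __name__ == "__main__":
    sample = "Feedback for ADM-10234 (parent: ravi@example.com, 9830012345): strong essay."
    print(redact(sample))
```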
For broader guidance on responsible classroom use, see UNESCO's recommendations on generative AI in education: UNESCO guidance.
Cybersecurity: write it down, lock it down
In a follow-up session, Additional Commissioner of Police Pranav Kumar cautioned that cyberattacks are becoming more automated, scalable, and faster with AI. He urged school heads to keep a clear, written policy that's widely understood: what tools are allowed, how data and accounts are used, and who is responsible for what.
He stressed priorities: protect sensitive information of students, parents, and staff; keep school systems and services running; safeguard financial transactions (fees, payments, vendor dealings); and defend institutional identity, reputation, and trust. He also called for immediate reporting of incidents and regular training so staff know how to respond.
The minimum security baseline for schools
- Access control: Use role-based access in your ERP, enforce multi-factor authentication, and remove unused accounts promptly.
- Device health: Keep systems patched, use endpoint protection, and encrypt staff laptops and portable drives.
- Email and identity: Enable SPF, DKIM, and DMARC; train staff to spot phishing; verify vendor bank detail changes by phone (a record-check sketch follows this list).
- Backups and continuity: Keep offline backups (3-2-1 rule) and test restores. Document an incident runbook.
- Data handling: Classify data, restrict sharing, and set retention timelines for student records and CCTV footage.
- Vendor checks: Review contracts for data protection, breach reporting, and security standards before adoption.
- Network hygiene: Segment admin, student, and guest Wi-Fi; change default passwords; log and monitor access.
- Awareness cadence: Quarterly drills, phishing simulations, and short refreshers for all staff, including substitutes.
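As one way to sanity-check the email and identity item above, the sketch below uses the third-party dnspython package (pip install dnspython) to report whether SPF and DMARC TXT records are published for a domain. The domain name is a placeholder, and DKIM is omitted because checking it requires knowing the selector your mail provider uses.

```python
# Requires the third-party dnspython package:  pip install dnspython
import dns.resolver

def get_txt_records(name: str) -> list[str]:
    """Return the TXT records published at `name`, or an empty list if none."""
    try:
        answers = dns.resolver.resolve(name, "TXT")
    except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
        return []
    return [b"".join(rdata.strings).decode() for rdata in answers]

def check_email_auth(domain: str) -> None:
    """Report whether SPF and DMARC records exist for the school's domain."""
    spf = [r for r in get_txt_records(domain) if r.lower().startswith("v=spf1")]
    dmarc = [r for r in get_txt_records(f"_dmarc.{domain}") if r.lower().startswith("v=dmarc1")]
    print(f"{domain}: SPF {'found' if spf else 'MISSING'}, DMARC {'found' if dmarc else 'MISSING'}")

if __name__ == "__main__":
    check_email_auth("example-school.edu.in")   # placeholder domain
```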
For incident support and advisories, refer to CERT-In: cert-in.org.in. Report cybercrime promptly at the national portal: cybercrime.gov.in.
Cyberbullying and blackmail: act early
With more cases involving minors and persistent under-reporting, schools need clear, confidential pathways so students can speak up without fear. Early reporting protects students and preserves evidence.
- Publish a simple reporting flow for students, parents, and staff, with confidential options.
- Escalate promptly to counselors and designated safeguarding leads; involve law enforcement when required.
- Train students on consent, privacy, and bystander action; schedule recurring awareness sessions.
- Preserve messages, screenshots, and logs; avoid confronting alleged offenders directly.
90-day action plan for school heads
- Days 1-30: Draft or update your AI and cybersecurity policies; inventory systems and data; enable MFA for staff.
- Days 31-60: Roll out teacher training on AI use and assessment integrity; segment Wi-Fi; set up tested backups.
- Days 61-90: Pilot AI in two subjects with clear guardrails; run a phishing drill; conduct a tabletop incident exercise.
If your staff needs quick, practical upskilling on AI for classroom use and policy implementation, see curated options here: AI courses by job.