MagicSchool's founder on the potential and perils of AI
Adeel Khan saw what many missed in late 2022: AI could remove the grind without removing teacher judgment. He launched MagicSchool to do that job. Today the platform runs on ChatGPT and Claude, has raised $60 million, employs 160 people, and counts 20,000 schools on paid plans. A basic version is free.
Khan says 3.5 million U.S. educators have signed up, with roughly 700,000 active users in October. For context, there are about 4 million teachers in the country. Popular use cases: building worksheets, drafting feedback, and creating presentations.
Where AI actually helps: IEP drafting
MagicSchool includes an IEP (Individualized Education Program) generator. Khan's stance is blunt: teachers supply the deep knowledge; AI formats it into the document. In his view, the tool offloads paperwork so special educators can spend more time on students.
Some districts still turn the feature off. That's reasonable given the stakes. If you consider this route, set a clear workflow: teacher inputs the plan, AI formats it, teacher reviews for accuracy and compliance, and the team finalizes. If you want a refresher on the process itself, see the U.S. Department of Education's overview of the IEP process.
Guardrails against "cognitive offloading"
Khan worries about over-reliance too, by students and teachers alike. Inside MagicSchool, tools prompt teachers to think first: standards, context, constraints. That's intentional.
Adopt the same stance in your school: AI can draft; you decide. Require educators to set objectives, success criteria, and student context before using any generator. Make the teacher's judgment the bottleneck.
AI as a tutor: assist, don't replace
Can AI tutor? Khan's answer: yes and no. No, if you expect a bot to replace a teacher with videos and generic chat. Yes, if you treat it as an extension of the teacher's feedback loop.
One solid pattern: students write without AI assistance, then receive AI feedback aligned to a teacher-provided rubric. Students can ask follow-up questions and revise. That keeps the thinking where it belongs, on the student, while speeding up iteration.
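If your team wants to prototype this pattern outside any particular platform, here's a minimal sketch using the OpenAI Python SDK. This is not MagicSchool's implementation; the model name, rubric text, prompt wording, and student draft are illustrative assumptions.

```python
from openai import OpenAI  # assumes the OpenAI Python SDK is installed and an API key is configured

client = OpenAI()

# Illustrative teacher-provided rubric and a student draft written without AI help.
rubric = "1) Clear thesis  2) Evidence from the text  3) Organization  4) Grammar and mechanics"
student_draft = "The author argues that school gardens improve attendance because..."

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system",
         "content": "You are a writing tutor. Give feedback strictly against the teacher's rubric. "
                    "Do not rewrite the essay; point to specific sentences and ask guiding questions."},
        {"role": "user",
         "content": f"Rubric:\n{rubric}\n\nStudent draft:\n{student_draft}"},
    ],
)

print(response.choices[0].message.content)  # feedback the student uses to revise
```

The design choice that matters is in the system prompt: the model critiques against the teacher's rubric and asks guiding questions instead of rewriting, so the revision work stays with the student.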
Math is improving, but still verify
Math remains a weak spot. Example: the system correctly explained the Pythagorean theorem, then accepted a wrong value for √120. Khan notes the platform now flags that its math computations aren't fully reliable. That's the right call.
Practical move: require exact answers with step checks, and verify with a calculator or separate math engine. Don't let AI "check" student math without a second source. Treat it like a junior TA: helpful, not final.
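Here's what that second source can look like in practice: a minimal Python sketch (ours, not MagicSchool's) that compares an AI-reported value against Python's math module, using the √120 example above. The claimed value and tolerance are illustrative.

```python
import math

def check_claim(claimed: float, correct: float, tol: float = 1e-3) -> bool:
    """Return True if the AI-claimed value matches an independent computation."""
    return abs(claimed - correct) <= tol

# The √120 example: an AI-accepted value checked against a second source.
ai_claimed = 10.0                # illustrative wrong answer
second_source = math.sqrt(120)   # independent computation: about 10.954

print(f"sqrt(120) = {second_source:.3f}")
print("AI value accepted?", check_claim(ai_claimed, second_source))  # False: flag for review
```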
Is AI a net negative right now?
Khan leans yes, at least today. Shortcuts are easier than ever, and students know it. The fix isn't fear; it's clear norms and instruction on responsible use.
Teach the why behind constraints. For example, "no AI during first drafts" has a purpose: writing is thinking. Allow AI only after the thinking is on the page, to tighten organization, grammar, and evidence.
A practical playbook for school leaders
- Define red, yellow, green use cases. Red: no AI for first-draft writing, tests, or novel problem solving. Yellow: AI for brainstorming, outlines, and post-draft edits with attribution. Green: formatting, translation checks, rubrics, parent letters, and IEP document structuring with human review.
- Adopt a "think-first" policy. Every AI-assisted task starts with teacher inputs: goals, standards, student context, constraints, exemplars.
- IEP workflow. Teacher crafts the plan, AI formats, teacher verifies accuracy and compliance, team signs off. Protect student data and follow district policy.
- Math guardrails. Use AI for explanations and practice prompts, but verify computations separately. Require students to show steps and reasoning.
- Assessment integrity. Use in-class writing samples and oral checks to anchor grading. Compare AI-assisted work to baseline performance.
- Student education. Explicitly teach responsible use and long-term tradeoffs of outsourcing thinking. Make the norms visible in syllabi and rubrics.
- PD for staff. Short, recurring sessions: prompts that elicit better outputs, review techniques, bias checks, and data privacy basics. The U.S. Department of Education's AI guidance is a useful starting point.
- Quality assurance. Spot-audit AI-generated materials monthly. Track time saved, student outcomes, and error rates. If it doesn't move learning or workload, cut it.
What MagicSchool is getting right, and where it must improve
What's working: teacher-centered design, prompts that force upfront thinking, and strong traction across schools. Where it needs work: math accuracy and airtight safeguards against misuse.
The bigger lesson: tools won't fix instruction. Clear rules, consistent modeling, and thoughtful use will.
Bottom line
AI can remove the busywork. It can speed feedback. It can structure complex documents like IEPs, so long as educators stay in charge. If you lead a school, make it explicit: AI assists; teachers decide; students think.