AI in the Classroom: 5 Lessons From Stanford's Jon Levin and MIT's Mark Gorenberg
AI is helping students write. That's clear. The concern from Stanford University President Jon Levin is what comes next: will students stop reading because the machine can summarize anything?
"Maybe that's the future? I don't know. To me it's very sad. … Something about immersing yourself in a book. … I hope this next generation will not miss that experience," Levin said during a conversation with Mark Gorenberg, chair of the MIT Corporation, at Hopkins School.
1) Don't let AI become a crutch
Levin's blunt take: students are busy, grades matter, and AI tools are everywhere and getting better every day. That mix can push students to outsource thinking instead of building the muscle themselves.
Gorenberg's angle is similar. If students don't rewrite in their own words, they risk weak retention and shallow comprehension. AI can start the draft; the student has to finish the thinking.
- Require artifact trails: outlines, drafts, revision notes, and short reflections on how AI was used.
- Use oral defenses or quick check-ins to confirm understanding after written work.
- Design prompts that demand citation of course texts and page-specific evidence, not generic answers.
- Rotate AI-off tasks (in class) with AI-on tasks (at home) to build both independence and tool fluency.
- Grade process and clarity of thought, not just product polish.
2) Reading still matters, so protect it
If AI summarizes everything, the temptation is to skip the source. That's the risk Levin sees: losing the experience of deep reading. The fix isn't banning tools; it's building reading accountability into the work.
- Use short, targeted reading checks that ask for personal interpretation or confusion points.
- Ask for annotated excerpts or margin notes as evidence of engagement.
- Shift from "What happened?" to "Why does this matter?" questions that require nuance.
- Use timed, in-class close reading with brief written responses.
3) Yes, it feels like sci-fi, and students feel it
AI can do "humanistic things" surprisingly well, Levin noted. Some people even talk to tools like ChatGPT "in a deep way." That reframes a familiar mental-health question: what does it mean to be human alongside capable machines?
Levin put it plainly: "It's a sci-fi world, but we're basically there." Gorenberg's stance: technology can go either way, so smart regulation and culture matter. "I think students are going to use [AI], and I think they should use it. And I think for good it has the chance to positively transform our society."
- Discuss AI's limits and strengths in class: bias, hallucinations, and overconfidence.
- Normalize human skills: judgment, ethics, taste, empathy, and original thought.
- Offer clear norms for AI use, including what "acceptable assistance" looks like for each assignment.
4) Invest in AI: curriculum and workflow
Gorenberg's 10-year view: AI should sit next to writing, math, and science in the core curriculum. Students will need it to function in life and work.
Levin agreed on the upside: "The AI tools are incredible in terms of unlocking knowledge." For schools, that means two tracks: teach students to use AI well, and give teachers AI that gives them time back.
- Curriculum: add AI literacy (prompting, verification, responsible use), domain-specific AI use (science labs, language support, data analysis), and assessment redesign.
- Teacher workflow: use AI for lesson outlines, exemplars, rubrics, parent emails, feedback starters, and differentiation plans.
- Policy: define permitted tools, disclosure rules, and acceptable citation of AI assistance.
- Procurement: prioritize data privacy, audit logs, admin controls, and integration with existing platforms.
5) Curiosity still wins
What Levin wants to see in new undergraduates: curiosity, risk-taking, and optimism. Students willing to try things that might not work tend to grow faster.
AI can help here. As Gorenberg noted, it lowers the barrier to start a new skill or explore a topic. "Being curious about a new skill and not saying, 'Hey, I can't tackle that,' because with AI, you can."
- Encourage project-based learning where AI helps with ideation, research mapping, and early drafts.
- Have students compare their output with and without AI, then reflect on what changed.
- Use AI for scaffolding, not shortcuts: coaches, not crutches.
What to do this semester
- Add an AI use statement to every assignment, plus required disclosure of tools and prompts used.
- Adopt a 10-minute oral checkpoint for major projects to confirm understanding.
- Run a professional learning sprint: pick one AI tool, one workflow, and one assessment to redesign.
- Create a simple rubric column for "original thinking" and "evidence from course texts."
Further resources
- Stanford Institute for Human-Centered AI
- AI courses by job role (educators included) - Complete AI Training
Bottom line: teach students to think, read, and question, then let AI help them go further. Keep the human work human, and use the tools to buy back time for what matters.