AI as a Ladder, Not a Crutch: How Emory Is Teaching with AI Without Losing the Human Touch

Emory treats AI as a ladder for learning, with guardrails, a human in the loop and hands-on practice. From prompt-led study to AI teaching assistants and faster feedback, grades and access are improving.

Categorized in: AI News Education
Published on: Dec 09, 2025

"A Ladder, Not a Crutch": How One Campus Makes AI Work For Learning

Three years after ChatGPT hit the public, AI is in almost every classroom conversation. Some fear shortcuts and cheating. Others see a chance to fix access gaps and teach modern skills. The most useful move so far: treat AI as a tool students learn with, not a tool they hide behind.

At Emory, schools and departments set their own AI policies. The throughline is simple: keep use responsible, keep a human in the loop and keep the focus on actual learning.

Clear guardrails, local control

Leadership encourages ethical use while giving professors room to build what works for their subject. That mix matters. Policy without flexibility gets ignored. Flexibility without guardrails invites problems.

Psychology: ditching the textbook for prompt-driven study

In PSYC 110, students use a three-step prompt routine instead of a $100+ textbook: ask for core concepts, ask how those ideas show up in everyday life, then request an adaptive quiz. The goal: use AI as a ladder, not a crutch.

Grades have improved, according to the instructor, and students appreciate the lower cost and faster feedback. A student learning assistant noted a different challenge: switching between classes that ban AI and classes that require it can be confusing. The fix is clarity and consistency inside each course.

Key caution from the course: pressure to be perfect drives misuse. Reduce that pressure and you reduce shortcuts. Also, teach students how to verify output to avoid false claims and made-up citations. Hallucinations are real, and they're common even in newer systems; Nature has published a useful overview of the problem for a quick primer.

Business: 24/7 AI TAs built on course content

Office hours don't work for everyone. To close the gap, some business courses now use an AI TA trained on their own materials. Students can ask questions anytime and get answers based on the syllabus, slides and assignments.

The best setup so far is a hybrid: human TAs plus the AI TA. Students like the instant access and the chance to ask "basic" questions without fear of judgment. Adoption is spreading across courses, with faculty customizing chatbots to match each class.
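
The article doesn't describe how these chatbots are built, but the core pattern (retrieve the most relevant course material, then answer only from it) can be sketched in a few lines. Everything below, from the openai client to the course_materials folder and the model names, is an assumption for illustration, not Emory's actual stack.

    # Hypothetical sketch of a course-grounded "AI TA": retrieve the most relevant
    # course snippets, then answer only from that context. Paths and models are
    # illustrative assumptions.
    from pathlib import Path

    import numpy as np
    from openai import OpenAI  # assumes OPENAI_API_KEY is set in the environment

    client = OpenAI()

    def embed(texts):
        """Embed a batch of texts; returns an (n, d) numpy array."""
        resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
        return np.array([item.embedding for item in resp.data])

    # Load course materials (syllabus, slides, assignment sheets) as plain text.
    docs = [p.read_text() for p in Path("course_materials").glob("*.txt")]
    doc_vecs = embed(docs)

    def answer(question, k=3):
        """Answer a student question using only the k most similar course snippets."""
        q_vec = embed([question])[0]
        sims = doc_vecs @ q_vec / (np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(q_vec))
        context = "\n\n".join(docs[i] for i in np.argsort(sims)[-k:])
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system",
                 "content": "You are a course TA. Answer only from the provided course "
                            "materials. If the answer is not there, say so and point the "
                            "student to the human TA."},
                {"role": "user", "content": f"Course materials:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return resp.choices[0].message.content

    print(answer("When is the first case write-up due?"))

Restricting answers to retrieved course content is the design choice that keeps the bot from improvising deadlines or policies; anything outside the materials gets routed back to a human TA.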

Nursing: interactive video lectures with AI-driven questions

One nursing course uses an interactive video platform that generates targeted questions from the lecture. Students chat with the system, answer embedded questions and get guided correction. Scores on related exam items rose for students who engaged with these modules.
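
The article doesn't name the platform, but the basic mechanic (turn a transcript segment into targeted check questions with feedback) can be sketched roughly like this. The client, model and JSON shape are assumptions for illustration.

    # Illustrative sketch only: turn a lecture-transcript segment into embedded
    # check questions with feedback. Model and JSON format are assumptions.
    import json

    from openai import OpenAI

    client = OpenAI()

    def questions_for_segment(transcript_segment, n=2):
        """Ask the model for n multiple-choice questions targeted at this segment."""
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            response_format={"type": "json_object"},
            messages=[
                {"role": "system",
                 "content": "Write multiple-choice questions that test the key idea of the "
                            "transcript. Return JSON: {\"questions\": [{\"stem\": str, "
                            "\"options\": [str], \"answer\": str, \"feedback\": str}]}"},
                {"role": "user", "content": f"Write {n} questions for:\n{transcript_segment}"},
            ],
        )
        return json.loads(resp.choices[0].message.content)["questions"]

    segment = "Sepsis is a dysregulated host response to infection; early signs include..."
    for q in questions_for_segment(segment):
        print(q["stem"], q["options"], q["feedback"], sep="\n")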

Reality check: building high-quality content takes real instructor time. Early adopters report hurdles, but also say the payoff is worth it when the experience improves.

Assessment: AI grading with a human check

Some instructors are using AI to speed up grading and deliver formative feedback faster. Most students prefer the immediate, growth-focused comments. But the system isn't perfect.

One effective practice: instructors audit the AI's work and invite students to flag errors. A student who catches a grading mistake doesn't just get it fixed; they get rewarded. That turns quality control into another learning moment.
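
As a rough illustration of "AI drafts, human signs off," here is one way such a workflow could look in code. The rubric, model and Feedback fields are invented for the sketch; the point is that nothing counts as reviewed until an instructor clears it.

    # Hypothetical sketch of AI-drafted formative feedback with a human sign-off step.
    # Rubric, model and field names are invented for illustration.
    from dataclasses import dataclass, field

    from openai import OpenAI

    client = OpenAI()
    RUBRIC = "Thesis clarity (0-3), use of evidence (0-4), organization (0-3)."

    @dataclass
    class Feedback:
        student_id: str
        draft_comments: str
        reviewed_by_human: bool = False      # nothing is released until this flips
        corrections: list = field(default_factory=list)

    def draft_feedback(student_id, submission):
        """Generate growth-focused comments; the grade itself stays with the instructor."""
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[
                {"role": "system",
                 "content": "Give formative, growth-focused feedback against this rubric: "
                            + RUBRIC + " Do not assign a final grade."},
                {"role": "user", "content": submission},
            ],
        )
        return Feedback(student_id, resp.choices[0].message.content)

    def instructor_signoff(fb, corrections=()):
        """Human audit step: record any fixes (including student-flagged ones), then release."""
        fb.corrections.extend(corrections)
        fb.reviewed_by_human = True
        return fb

A student who flags an error would simply add to the corrections list before sign-off, which mirrors the reward-for-catching-mistakes practice above.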

Research and upskilling: the AI.Data Lab model

Beyond classes, a campus AI lab brings students from many disciplines together to build projects. The emphasis is collaboration, ethics and keeping a human decision-maker in every loop.

There's a healthy worry here: if students outsource too much thinking, skills decay. The antidote is deliberate practice with AI: use it, question it and make your own call at the end.

What educators can copy this term

  • State your AI policy clearly on day one. What's allowed, what's not and how to cite use.
  • Teach AI in the open. Create assignments that require students to show prompts, outputs and their revisions.
  • Pilot a course-specific chatbot. Start with FAQs and assignment clarifications trained on your materials.
  • Swap some readings for prompt workflows. Use a 3-step routine: learn the concept, apply it to life or work, then quiz with increasing difficulty.
  • Use AI for formative feedback, not final judgment. Keep a manual review step on grades.
  • Lower the incentive to cheat. Allow resubmissions, reflective notes and partial credit for process.
  • Teach hallucination hygiene. Require source checking, link verification and citation audits on any AI-assisted work; a quick link-check sketch follows this list.
  • Mind equity. If a textbook costs $100+, provide AI-driven or open alternatives, plus a print copy on reserve.
  • Measure what matters. Track baseline vs. post-adoption scores and gather short student feedback after each unit.
  • Budget build time. Batch content creation, reuse prompts, and recruit learning assistants to help iterate.
  • Protect data and integrity. Use course-only training sets and document all AI use in syllabi and assignments.
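
On the hallucination-hygiene point, a small script can handle the most mechanical part: confirming that cited links actually resolve. This is an illustrative helper built on the requests library, not a tool mentioned in the article, and a link that resolves still has to be read to confirm it supports the claim.

    # Minimal "hallucination hygiene" helper: pull URLs out of an AI-assisted
    # submission and confirm each one actually resolves. Illustrative only.
    import re

    import requests

    URL_PATTERN = re.compile(r"https?://\S+")

    def check_links(text, timeout=5.0):
        """Return each URL in the text mapped to its HTTP status (or the error raised)."""
        results = {}
        for url in URL_PATTERN.findall(text):
            url = url.rstrip(").,;")  # strip punctuation that trailed the link in prose
            try:
                resp = requests.head(url, allow_redirects=True, timeout=timeout)
                results[url] = f"HTTP {resp.status_code}"
            except requests.RequestException as exc:
                results[url] = f"failed: {type(exc).__name__}"
        return results

    essay = "As argued in https://doi.org/10.1000/fake-doi , the effect replicates."
    for url, status in check_links(essay).items():
        print(url, "->", status)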

Quick-start prompt flow (from the PSYC model)

  • "Explain [topic] in plain language with 3 examples."
  • "Show how [topic] shows up in everyday life for a [student's major/interest]."
  • "Create an adaptive quiz: 5 easy, 5 medium, 5 hard. After each answer, explain why."

Risks to manage

  • Incorrect answers and fake citations: require verification steps and spot checks.
  • Overdependence: center assignments on student reasoning, not AI output.
  • Inconsistent rules across courses: put your stance in the syllabus and repeat it in class.
  • Instructor workload: start small, template your prompts and iterate with student feedback.

Why this matters for your students' careers

Graduates will work alongside AI tools, especially in business, healthcare and data-heavy fields. The students who learn to question, direct and improve these tools will add the most value. That starts with purposeful use in your class.

Bottom line: use AI as a ladder. Keep your standards. Keep a human in the loop. And design learning so students do the thinking, with AI as their assistant, not their substitute.

