AI in medical education: practical paths from virtual patients to workforce-ready training
From left: Researchers Assoc Prof Liu Nan, Dr Ning Yilin and Dr Jasmine Ong explore how collaborations can contribute to unlocking the potential of AI in transforming medical education. Image credit: Duke-NUS Medical School
A new study in The Lancet Digital Health outlines how AI is set to reshape medical education - and what must happen to make it safe, effective, and equitable. The authors call for tight coordination across medical schools, hospitals, industry partners, and regulators to move from pilots to proven, scalable practice.
The timing matters. Health systems face rising demand and a projected shortfall of clinicians by 2030. Smarter training pipelines are no longer a nice-to-have; they're mission-critical.
What AI can do for medical training today
- Virtual patients at scale: LLM-driven case generators can produce consistent, diverse scenarios - including rare conditions - for diagnostic reasoning and communication practice.
- Procedural rehearsal with AR/VR: Simulated environments let learners practice skills like venipuncture and acute care protocols with real-time feedback.
- Metaverse classrooms: Team-based learning, case discussions and debriefs can run synchronously or asynchronously, across sites and time zones.
- Research co-pilot: Tools that assist with literature reviews, protocol drafting and data summarization can free up time for critical thinking.
- Digital co-tutor: Adaptive feedback, question generation and spaced practice help learners close gaps faster and with more precision.
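The virtual-patient idea above amounts to turning a structured case specification into a prompt that any chat-completion LLM could act out. A minimal sketch of that step, assuming a hypothetical `CaseSpec` shape and `build_case_prompt` helper (illustrative names, not from the study):

```python
from dataclasses import dataclass

@dataclass
class CaseSpec:
    """Hypothetical spec for one virtual-patient scenario (illustrative)."""
    condition: str       # target diagnosis, e.g. "acute pancreatitis"
    rarity: str          # "common" or "rare" presentation
    learner_level: str   # e.g. "clerkship", "residency"

def build_case_prompt(spec: CaseSpec) -> str:
    """Assemble a structured prompt that an LLM case generator
    (any chat-completion API) could turn into an interactive case."""
    return (
        f"Generate a virtual patient for a {spec.learner_level} learner.\n"
        f"Target diagnosis: {spec.condition} ({spec.rarity} presentation).\n"
        "Include: chief complaint, history, vitals, and 3 distractor findings.\n"
        "Respond only in character as the patient; reveal findings when asked."
    )

prompt = build_case_prompt(CaseSpec("acute pancreatitis", "rare", "clerkship"))
print(prompt)
```

Keeping the spec structured like this makes it easy to batch-generate consistent, diverse cases - including rare conditions - and to vary difficulty by learner level.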
Dr Jasmine Ong noted: "AI is not here to replace clinical educators and mentors, but to empower them. AI enables educators and mentors to focus on what matters most - fostering meaningful connections with their learners. Serving as a digital co-tutor, AI enhances the learning experience through personalised feedback and realistic clinical simulations, helping to shape the next generation of healthcare professionals."
Risks and constraints to manage
- Accuracy and safety: Hallucinations can mislead learners; medical content demands rigorous validation and oversight.
- Bias: Gender, racial and other biases can appear in training data and generated outputs, reinforcing disparities if left unchecked.
- Privacy: Use of patient data must comply with institutional and legal standards; de-identification and access controls are non-negotiable.
- Capability gaps: Too few qualified trainers and limited implementation playbooks slow adoption and quality assurance.
- Learning integrity: Clear policies are needed to guide appropriate use, assessment conditions and academic honesty.
Dr Ning Yilin emphasized the need for "clear guidance and inclusive, responsible design" as AI integrates into training and assessment. This means documented standards, transparent model use, and learner safeguards across programs.
Collaboration is the unlock
Associate Professor Liu Nan highlighted a simple truth: isolated pilots won't change outcomes at scale. Cross-sector partnerships can align standards, evaluation methods and funding models so that the benefits of generative AI translate into better patient care.
What leaders can do next
- Form a cross-functional coalition: Bring together deans, clinical educators, simulation leads, data protection officers, and regulators early.
- Prioritize high-yield use cases: Start with virtual patients for differential diagnosis, formative feedback for clerkships, and AR/VR procedural practice.
- Set governance and guardrails: Define acceptable use, citation norms, PHI handling, model selection, and human-in-the-loop review.
- Audit for bias and quality: Use representative cases, fairness checks, and outcome tracking across learner groups.
- Upskill faculty: Provide hands-on workshops for prompt design, evaluation methods, and classroom integration.
- Evaluate rigorously: Run controlled pilots with clear metrics (competency attainment, time-to-competence, retention, patient-safety proxies) before scaling.
- Plan sustainable resourcing: Budget for licenses, compute, instructional design support and ongoing content curation.
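The bias-and-quality audit step above can start very simply: compare assessment pass rates across learner groups and flag any group that falls well behind the best performer. A toy demographic-parity-style check, where the group labels and threshold are illustrative assumptions, not from the study:

```python
from collections import defaultdict

def pass_rate_gaps(records, threshold=0.10):
    """Flag learner groups whose pass rate falls more than `threshold`
    below the best-performing group. Toy fairness check; real audits
    would add significance testing and outcome tracking over time."""
    totals = defaultdict(lambda: [0, 0])   # group -> [passes, attempts]
    for group, passed in records:
        totals[group][0] += int(passed)
        totals[group][1] += 1
    rates = {g: p / n for g, (p, n) in totals.items()}
    best = max(rates.values())
    return {g: round(best - r, 3) for g, r in rates.items() if best - r > threshold}

# Toy audit data: (learner group, passed assessment?)
records = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False)]
print(pass_rate_gaps(records))   # flags group B's gap vs. group A
```

A check like this is a floor, not a ceiling: it catches gross disparities cheaply, while fuller audits should use representative cases and track downstream competency outcomes per group.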
Why this matters
AI can extend educator capacity, standardize exposure to complex cases, and accelerate competence - without compromising safety or ethics. With the right safeguards and shared frameworks, schools and hospitals can produce more practice-ready clinicians and protect care quality.
Where to build team capability
If you're standing up AI literacy programs for educators, residents, or research staff, curated training libraries can shorten the ramp-up. See options by role at Complete AI Training.
Source: Duke-NUS Medical School (15.11.2025)