MOH to share healthcare AI lessons with National AI Council: what it means for clinicians and leaders
Singapore's Ministry of Health (MOH) will contribute its healthcare AI experience to the new National Artificial Intelligence Council, chaired by Prime Minister Lawrence Wong. Health Minister Ong Ye Kung said MOH has been using AI for some time with a case-centric model that keeps clinicians firmly in the loop.
His stance is clear: AI should support doctors, nurses, and allied health professionals to work better, not replace clinical judgment. The goal is better outcomes for patients and more meaningful, sustainable work for care teams.
Key context
The National AI Council, announced during Budget 2026 on Feb 12, will drive AI missions in four sectors: advanced manufacturing, transport connectivity, finance, and healthcare. Mr Ong said that MOH, as a member of the inter-ministerial council, will share lessons learned and continue to improve how AI is used across care settings.
"Ultimately, using AI will help to improve healthcare, benefit patients and make the jobs of our clinicians better," he noted.
What "case-centric, clinician-in-the-loop" looks like in practice
- Start from the patient case, not the algorithm. Define the clinical decision, workflow gap, or safety risk first.
- AI augments, clinicians decide. Models propose; care teams review, override, and document rationale.
- Local context matters. Calibrate and validate on Singapore populations, care pathways, formularies, and coding standards.
- Measure impact continuously. Track accuracy, timeliness, patient outcomes, safety signals, and workload changes.
High-value use cases to prioritise in the next 12 months
- ED and polyclinic triage support: risk stratification, sepsis flags, chest pain pathways, falls risk.
- Medical imaging: prioritisation worklists, quality checks, assistive reads for common findings.
- Clinical documentation: ambient scribing, note summarisation, discharge instructions aligned to local languages.
- Care coordination: case summaries across institutions, follow-up task automation, referral quality checks.
- Medication safety: interaction checks, duplicate therapy flags, dose range oversight for paediatrics and renal dosing.
- Bed flow and theatre scheduling: predictive lengths of stay, cancellation risk, resource balancing.
- Population health: risk registries, outreach targeting, remote monitoring signal triage.
- Patient communications: AI-assisted FAQs, appointment management, wayfinding, symptom advice with escalation rules.
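The "escalation rules" idea in the last bullet can be made concrete with a small routing sketch: any red-flag symptom or low model confidence sends the case to a human clinician instead of an automated reply. The symptom list, threshold, and function names here are illustrative assumptions, not MOH policy.

```python
# Hypothetical escalation rules for AI-assisted patient communications.
# Red-flag symptoms and the confidence floor are illustrative only.
RED_FLAGS = {"chest pain", "shortness of breath", "confusion", "severe bleeding"}
CONFIDENCE_FLOOR = 0.85  # below this, never auto-respond

def route(symptoms: set, model_confidence: float) -> str:
    """Decide whether an AI assistant may answer or must escalate."""
    if symptoms & RED_FLAGS:
        return "escalate_to_clinician"   # safety rule overrides the model
    if model_confidence < CONFIDENCE_FLOOR:
        return "escalate_to_clinician"   # uncertain cases go to a human
    return "ai_assisted_reply"

print(route({"cough", "runny nose"}, 0.93))  # ai_assisted_reply
print(route({"chest pain"}, 0.99))           # escalate_to_clinician
```

Note the ordering: the hard safety rule is checked before model confidence, so a confident model can never talk past a red flag.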
Guardrails that matter in Singapore
- Accountability and explainability: ensure auditable decisions, human override, and clear incident response.
- Data protection and consent: align to PDPA and institutional policies; use de-identification and role-based access.
- Bias and clinical safety: monitor subgroup performance (age, sex, ethnicity, comorbidities); run prospective safety pilots.
- Regulatory and ethics review: involve IRB/ethics where applicable; classify tools appropriately (clinical vs non-clinical).
- Cybersecurity and supply chain: vet vendors, isolate models where needed, log prompts and outputs for forensics.
- Staff training: educate clinicians on model limits, red flags, and documentation standards.
Helpful references: Singapore's Model AI Governance Framework (PDPC) and WHO guidance on AI for health.
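The accountability guardrail above (auditable decisions, human override, incident response) can be sketched as an append-only decision log. This is a minimal illustration; the field names and the rule that overrides require a rationale are assumptions about how an institution might implement it.

```python
import json
from datetime import datetime, timezone

def log_decision(logfile, case_id, model_output, clinician_action, rationale=None):
    """Append one auditable record; overrides must carry a documented rationale."""
    if clinician_action == "override" and not rationale:
        raise ValueError("override requires a documented rationale")
    record = {
        "ts": datetime.now(timezone.utc).isoformat(),
        "case_id": case_id,
        "model_output": model_output,
        "clinician_action": clinician_action,  # "accept" or "override"
        "override_rationale": rationale,
    }
    logfile.write(json.dumps(record) + "\n")   # JSON Lines, append-only
```

In practice the log would sit behind role-based access and feed incident response and forensics, alongside the prompt/output logging mentioned under cybersecurity.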
Data and technical foundations
- Interoperability: HL7 FHIR, SNOMED CT, LOINC; minimise free-text silos.
- Quality datasets: representative cohorts, clinician-verified labels, continuous drift checks.
- Safe deployment: phased pilots, shadow mode, clear stop-go criteria, post-market surveillance.
- Integration: embed into EMR/ordering/imaging workflows; avoid app sprawl.
- MLOps: versioned models, prompt governance for generative AI, monitoring, rollback plans.
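To illustrate the interoperability bullet: a lab result expressed as a minimal HL7 FHIR (R4) Observation, coded with LOINC and UCUM units rather than buried in free text. The patient reference and value are made up; only the coding systems are real.

```python
import json

# A minimal FHIR Observation: structured and machine-readable. LOINC 718-7
# is the standard code for blood haemoglobin; units use UCUM.
observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {"coding": [{
        "system": "http://loinc.org",
        "code": "718-7",
        "display": "Hemoglobin [Mass/volume] in Blood",
    }]},
    "subject": {"reference": "Patient/example"},  # hypothetical patient id
    "valueQuantity": {
        "value": 13.2,
        "unit": "g/dL",
        "system": "http://unitsofmeasure.org",
        "code": "g/dL",
    },
}
print(json.dumps(observation, indent=2))
```

Records in this shape can move between institutions and feed models without bespoke parsing, which is what "minimise free-text silos" buys in practice.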
90-day action plan for healthcare institutions
- Pick two use cases: one clinical (e.g., imaging triage), one operational (e.g., bed flow).
- Form a small squad: clinician lead, nursing lead, informatics, data scientist, risk/compliance, and IT security.
- Define success: target metrics, patient safety thresholds, and decision rights for overrides.
- Map data: availability, quality gaps, and privacy constraints; plan de-identification where possible.
- Pilot safely: run in shadow mode, compare against standard of care, log discrepancies, and refine.
- Report and scale: share outcomes with hospital leadership and MOH channels; prepare a playbook for replication.
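The "pilot safely" step above can be sketched as a shadow-mode comparison: the model's suggestion is computed but never shown to the care team, and disagreements with the standard of care are logged for review. The toy model, case tuples, and names below are illustrative assumptions.

```python
def shadow_compare(cases, model):
    """cases: iterable of (case_id, features, clinician_decision) tuples.

    Returns the disagreement rate and a discrepancy log for clinical review.
    """
    log, total = [], 0
    for case_id, features, clinician_decision in cases:
        total += 1
        suggestion = model(features)  # computed silently, never shown to staff
        if suggestion != clinician_decision:
            log.append({"case": case_id,
                        "model": suggestion,
                        "clinician": clinician_decision})
    rate = len(log) / total if total else 0.0
    return rate, log

# Toy pilot: a risk-threshold "model" versus recorded clinician decisions.
cases = [("A1", 0.92, "admit"),
         ("A2", 0.15, "discharge"),
         ("A3", 0.81, "discharge")]  # A3: model would admit, clinician did not
rate, log = shadow_compare(cases, lambda risk: "admit" if risk >= 0.7 else "discharge")
print(rate, log)
```

The logged discrepancies are exactly the cases worth clinical review, and the rate feeds the stop-go criteria defined before the pilot began.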
Community note
Mr Ong shared these updates while attending the opening of Gallop.SG's new stable at Admiralty Road East. The event brought together residents, including 100 invited low-income families, reinforcing the link between community support, prevention, and long-term health.
Where to build capability next
For teams planning structured upskilling and templates for safe, case-centric deployments, explore AI for Healthcare.
Healthcare will remain a priority sector under the National AI Council. With clinicians in the driver's seat and clear guardrails, AI can reduce friction in daily workflows and raise the ceiling on care quality for patients across Singapore.