Promise vs Practice: Why NHS AI Rollouts Keep Stalling
AI promises quicker diagnoses, but a UK study finds governance gaps, thin training, and tricky IT are stalling NHS rollouts. Results hinge on disciplined delivery, not hype.

NHS AI rollout: promise meets reality
AI is pitched as relief for stretched teams and faster, safer diagnoses. A new UK study shows the work is harder than the headlines: governance gaps, limited staff training, and tough IT integration are slowing progress across NHS hospitals.
The analysis, led by researchers at University College London (UCL) with partners at the Nuffield Trust and the University of Cambridge, offers a clear message for NHS leaders and policymakers: results depend on implementation discipline, not hype.
What the programme tried to do
In 2023, NHS England funded a £21m programme to deploy AI tools across 66 trusts to support chest imaging, including lung cancer detection. The tools aim to prioritise urgent cases and flag abnormalities for clinician review.
Researchers examined procurement and early deployment. Timelines stretched: contracting alone slipped by four to ten months. By June 2025, 18 months after the original completion target, 23 of the 66 trusts still hadn't brought the tools into clinical use.
What slowed the rollout
- Clinical engagement was hard to secure from already overloaded teams. Scepticism, especially among senior staff, centred on accountability and the need for clear human oversight.
- Training gaps left many unsure how the tools work, where they help, and where they don't.
- Integration was complex. New tools had to fit into ageing local IT estates that vary widely from site to site.
- Procurement was burdensome and highly technical, increasing the risk of missing critical details.
- Governance questions surfaced early: who owns the decision, the risk, and the audit trail?
What helped
- Dedicated project management with protected time.
- Committed local clinical and operational leads.
- Shared resources and learning through imaging networks.
- Clear national programme leadership.
What to do next in your trust
Translate the findings into action. Treat AI deployment like a service change, not a plug-in.
- Set governance early: appoint clinical and operational sponsors, clarify decision rights, and formalise human-in-the-loop policies.
- Protect delivery time: assign a project manager and clinical champions; make it part of job plans.
- Start training before go-live and keep it ongoing: how the model works, where it adds value, failure modes, escalation paths, and audit requirements. Address accountability head-on, and plan role-specific upskilling.
- Streamline procurement: use a nationally approved shortlist where available and a common evaluation template, and predefine success metrics, risk appetite, and support needs (a template sketch follows this list).
- Plan integration in detail: run early technical discovery with vendors and local IT; map data flows, security controls, and privacy checks; test in a sandbox; then phase the rollout with a clear cutover plan.
- Share and borrow: align with imaging networks to reuse configs, SOPs, and training assets.
- Measure impact: agree KPIs (reporting turnaround, backlog, accuracy/discordance, escalation rates, false positives) and review them monthly (see the KPI sketch after this list).
- Include patients and carers: build communication and consent pathways; feed their feedback into iterations.
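To make the procurement point above concrete, here is a minimal sketch of what a common evaluation template could look like if expressed as structured data. The criteria, weights, field names, and thresholds are illustrative assumptions, not a national standard or anything prescribed by the study.

```python
# Illustrative evaluation template for comparing shortlisted AI suppliers.
# All criteria, weights, and targets below are assumptions for the sketch.
EVALUATION_TEMPLATE = {
    "criteria": {
        # criterion: (weight, evidence to request from the supplier)
        "clinical_accuracy": (0.30, "peer-reviewed evidence on comparable populations"),
        "integration_effort": (0.20, "PACS/RIS interfaces, deployment model, local IT effort"),
        "information_governance": (0.20, "DPIA status, data flows, audit trail"),
        "support_model": (0.15, "SLAs, training offer, incident escalation"),
        "total_cost": (0.15, "licence, integration, and ongoing support costs"),
    },
    "success_metrics": {
        "reporting_turnaround_reduction_pct": 20,  # illustrative target
        "max_discordance_rate_pct": 5,
        "go_live_within_months": 9,
    },
    "risk_appetite": "human-in-the-loop only; no autonomous reporting",
}

def weighted_score(scores: dict[str, float]) -> float:
    """Combine 0-5 scores per criterion into a single weighted total."""
    return sum(
        weight * scores[name]
        for name, (weight, _evidence) in EVALUATION_TEMPLATE["criteria"].items()
    )

# Example: score one supplier (illustrative numbers only)
print(weighted_score({
    "clinical_accuracy": 4, "integration_effort": 3,
    "information_governance": 4, "support_model": 3, "total_cost": 2,
}))
```

Scoring every shortlisted supplier against the same weighted criteria keeps local evaluations comparable across trusts and makes the trade-offs explicit before contracting.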
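For the "Measure impact" item, a monthly KPI review can be computed from routine reporting data. The sketch below assumes a hypothetical per-study record with request and report timestamps plus AI and clinician findings; the field names and sample values are invented for illustration and do not reflect any trust's dataset.

```python
from dataclasses import dataclass
from datetime import datetime
from statistics import median

@dataclass
class StudyRecord:
    """One chest imaging study (illustrative fields, not a real extract)."""
    requested_at: datetime      # when the study was requested
    reported_at: datetime       # when the final report was issued
    ai_flagged: bool            # AI marked the study as abnormal/urgent
    clinician_abnormal: bool    # clinician's final read found an abnormality
    escalated: bool             # case was escalated for urgent review

def monthly_kpis(records: list[StudyRecord]) -> dict[str, float]:
    """Compute the KPIs named in the checklist: turnaround, discordance,
    escalation rate, and false-positive rate among AI flags."""
    turnarounds_h = [
        (r.reported_at - r.requested_at).total_seconds() / 3600 for r in records
    ]
    flagged = [r for r in records if r.ai_flagged]
    discordant = [r for r in records if r.ai_flagged != r.clinician_abnormal]
    false_pos = [r for r in flagged if not r.clinician_abnormal]
    return {
        "median_turnaround_hours": median(turnarounds_h),
        "discordance_rate": len(discordant) / len(records),
        "escalation_rate": sum(r.escalated for r in records) / len(records),
        "false_positive_rate": len(false_pos) / len(flagged) if flagged else 0.0,
    }

# Illustrative example with two studies
if __name__ == "__main__":
    sample = [
        StudyRecord(datetime(2025, 6, 1, 9, 0), datetime(2025, 6, 1, 15, 30),
                    ai_flagged=True, clinician_abnormal=True, escalated=True),
        StudyRecord(datetime(2025, 6, 2, 10, 0), datetime(2025, 6, 3, 9, 0),
                    ai_flagged=True, clinician_abnormal=False, escalated=False),
    ]
    for name, value in monthly_kpis(sample).items():
        print(f"{name}: {value:.2f}")
```

Reviewing these figures month on month, alongside backlog counts from the reporting system, gives the board evidence of whether the tool is actually shifting turnaround and where clinician-AI disagreement is concentrated.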
Policy signals to watch
Researchers caution that AI will support diagnostics, but it won't relieve service pressures as simply as hoped. Two priorities stand out: invest in early, continuous training and consider a national shortlist of vetted suppliers to cut procurement drag.
For context on the underlying initiatives, see the NHS AI Diagnostic Fund and the National Institute for Health and Care Research (NIHR).
The bottom line
AI can help clinicians focus on what matters most, but only if the basics are in place. Get the governance right, fund the people doing the work, integrate carefully, and train continuously. That's how promise turns into outcomes.