MPs warn opaque legacy IT estate stalls Whitehall's AI ambitions

Government can't quantify its legacy IT estate, and patchy transparency is already slowing AI. PAC puts legacy at 28% of central government systems; departments should map risks, decommission, and meet minimum standards before running pilots.

Categorized in: AI News, Government
Published on: Oct 17, 2025

Legacy IT blind spots are blocking AI across government

Two House of Commons committees raised the same concern this week: government cannot confidently quantify its legacy IT estate, and that gap is already slowing AI adoption. Transparency on spend and assets is patchy, and definitions vary across departments.

The result is predictable: risk builds, costs rise, and AI pilots stall on brittle systems that can't scale or integrate.

Transparency gap: policy intent vs. reporting reality

In an evidence session on 14 October 2025, the Science, Innovation and Technology Committee heard that connecting contract data to spending data remains inconsistent. Requirements under the Procurement Act to publish full details of contracts worth over £5m within 90 days of award are helping, but compliance is uneven.

Use of the act's notice types is rising month by month, yet some authorities are publishing fewer notices than before. That could reflect non-compliance or a retreat from voluntary publication; either way, visibility suffers. See the Procurement Act guidance on GOV.UK.
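
For teams auditing their own publication record, the underlying check is simple. Here is a minimal sketch in Python, assuming hypothetical field names for contract records; the Act's actual notice types and data fields live on the central digital platform, not in this shape:

    from datetime import date, timedelta

    PUBLICATION_THRESHOLD_GBP = 5_000_000  # threshold for publishing full contract details
    PUBLICATION_WINDOW_DAYS = 90           # days after award to publish those details

    def flag_late_publications(contracts):
        """Return contracts over the threshold whose details notice is missing or late.

        Each contract is a dict with hypothetical keys: 'id', 'value_gbp',
        'award_date' (datetime.date), and 'details_published' (date or None).
        """
        flagged = []
        for c in contracts:
            if c["value_gbp"] <= PUBLICATION_THRESHOLD_GBP:
                continue
            deadline = c["award_date"] + timedelta(days=PUBLICATION_WINDOW_DAYS)
            published = c["details_published"]
            if published is None or published > deadline:
                flagged.append((c["id"], deadline, published))
        return flagged

    # Example: one compliant award, one with no details notice on record.
    contracts = [
        {"id": "C-001", "value_gbp": 12_000_000,
         "award_date": date(2025, 6, 1), "details_published": date(2025, 7, 15)},
        {"id": "C-002", "value_gbp": 8_500_000,
         "award_date": date(2025, 5, 1), "details_published": None},
    ]
    print(flag_late_publications(contracts))  # flags C-002: no details notice by the deadline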

How much legacy IT is out there?

Crown Hosting Data Centres described a continued reliance on older server rooms and fragmented estates. Its remit is straightforward: consolidate, cut energy use, reduce carbon, and lower cost. It estimates public sector savings of £1.5bn a year from moving legacy kit into efficient facilities.

A recent example: the Department for Work and Pensions shifted data and applications into a private cloud arrangement within Crown Hosting as part of a broader cloud programme. DWP reports 70% of services now in public cloud, with £150m in annual savings on legacy IT and payback within nine months.

The definition problem is masking the scale

Departments must report annual IT energy consumption to HM Treasury, with Defra analysing the carbon figures. The systems behind those figures get labelled "legacy IT" because hyperscale cloud providers don't return carbon numbers through that process.

That framing effectively classifies all non-public cloud assets as legacy, regardless of age or support status. It simplifies reporting but blurs risk. New on-prem systems can get lumped in with out-of-support infrastructure, which distorts priorities.

PAC warning: legacy is choking AI and raising cyber risk

The Public Accounts Committee's report (15 October 2025) calls out "out-of-date legacy IT infrastructure" as a major obstacle to AI across government and a driver of cyber exposure. Using DSIT's definition (end-of-life, out of support, impossible to update, uneconomic, or above risk threshold), an estimated 28% of central government systems were legacy in 2024.

DSIT still cannot state how many legacy assets exist across government. PAC calls that unacceptable and urges urgent funding to remediate the highest-risk technology. See DSIT's policy direction in A blueprint for a modern digital government.
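
The gap between DSIT's risk-based definition and the carbon-reporting shortcut is easy to make concrete. A minimal sketch, assuming a hypothetical inventory record; the criteria below paraphrase the report's definition and are not an official schema:

    from dataclasses import dataclass

    @dataclass
    class SystemRecord:
        name: str
        hosting: str               # "public-cloud", "private-cloud", "on-prem"
        end_of_life: bool          # vendor has declared the product end-of-life
        out_of_support: bool       # no security patches or supplier support
        updatable: bool            # can still be patched or upgraded in practice
        cost_effective: bool       # run and change costs remain economic
        within_risk_threshold: bool

    def is_legacy_dsit(s: SystemRecord) -> bool:
        """Legacy per the DSIT-style criteria: any one condition is enough."""
        return (s.end_of_life or s.out_of_support or not s.updatable
                or not s.cost_effective or not s.within_risk_threshold)

    def is_legacy_by_hosting(s: SystemRecord) -> bool:
        """The carbon-reporting shortcut: anything not in public cloud counts."""
        return s.hosting != "public-cloud"

    # A modern, fully supported private-cloud system: not legacy by risk,
    # but "legacy" under the hosting-based shortcut.
    modern = SystemRecord("case-mgmt", "private-cloud",
                          end_of_life=False, out_of_support=False,
                          updatable=True, cost_effective=True,
                          within_risk_threshold=True)
    print(is_legacy_dsit(modern), is_legacy_by_hosting(modern))  # False True

That divergence is exactly the distortion the article describes: the second function inflates the estate with assets the first would never flag.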

What this means for departments

  • Stand up a single, living register of business-critical systems, mapped to owners, contracts, dependencies, data flows, and security posture (a minimal schema is sketched after this list). Do not wait for the centre to do this for you.
  • Classify "legacy" by DSIT risk definitions, not by hosting model. Distinguish out-of-support tech from modern on-prem or private cloud assets.
  • Tie every system to a measurable cost profile: run cost, change cost, outage cost, and carbon. Use this to rank remediation work (the sketch after this list shows one way to score it).
  • Set a hard standard: no AI pilots on systems without current support, tested APIs, data quality controls, and audit logging. Otherwise, pilots stall in integration work.
  • Prioritise decommissioning. Every system retired frees money and attention for AI-ready data and platforms.
  • Use the Procurement Act to force transparency: publish notices on time, require open interfaces, and push for outcome-based clauses that fund decommissioning alongside new capability.
  • Adopt a reference architecture for AI: event-driven data pipelines, common metadata, model registries, MLOps, and zero-trust access by default.
  • Exploit carbon reporting wisely. Use energy data as a signal, not the definition. Validate with support status, patch cadence, and vulnerability history.
  • Ringfence budget for the top 10 high-risk systems per department. Report quarterly on status, spend, and risk burn-down.
  • Build skills where it matters: contract management for modern software, data engineering, security architecture, and model governance.
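
As referenced above, a register only earns its keep if it drives a ranking. Here is a minimal Python sketch; the field names, carbon price, and scoring formula are illustrative assumptions, not a prescribed government methodology:

    from dataclasses import dataclass

    @dataclass
    class RegisterEntry:
        system: str
        owner: str
        run_cost_gbp: float       # annual cost to operate
        change_cost_gbp: float    # annual cost of changes and maintenance
        outage_cost_gbp: float    # expected annual cost of downtime
        carbon_tco2e: float       # annual carbon footprint
        risk_rating: int          # 1 (low) to 5 (critical), from security review

    def remediation_score(e: RegisterEntry, gbp_per_tco2e: float = 100.0) -> float:
        """Rank remediation candidates: total annualised cost, weighted by risk.

        The carbon price and multiplicative risk weighting are illustrative
        choices; tune both to your department's appetite.
        """
        total_cost = (e.run_cost_gbp + e.change_cost_gbp + e.outage_cost_gbp
                      + e.carbon_tco2e * gbp_per_tco2e)
        return total_cost * e.risk_rating

    register = [
        RegisterEntry("payments-legacy", "Finance", 2_000_000, 800_000, 1_500_000, 120, 5),
        RegisterEntry("hr-portal", "People", 300_000, 100_000, 50_000, 10, 2),
    ]
    for e in sorted(register, key=remediation_score, reverse=True):
        print(e.system, round(remediation_score(e)))

Even this crude score surfaces the right conversation: the expensive, high-risk system floats to the top regardless of where it happens to be hosted.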

Immediate actions for the next 90 days

  • Publish a department-wide legacy heatmap: systems, risk rating, owner, and retirement path (a sketch for generating one from the register follows this list).
  • Freeze new AI pilots that touch red-rated systems until minimum integration and security standards are met.
  • Issue a commercial addendum requiring vendors to provide complete SBOMs, support timelines, and exit plans.
  • Kick off two decommissioning sprints targeting quick wins that remove fees and reduce attack surface.
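
Producing the heatmap itself is a small job once the register exists. A minimal sketch, assuming hypothetical system rows and a simple red/amber/green mapping from the 1-5 risk rating used above:

    import csv

    # Hypothetical rows: (system, owner, risk_rating 1-5, retirement_path)
    systems = [
        ("payments-legacy", "Finance", 5, "replace by Q3 2026"),
        ("hr-portal", "People", 2, "retain; re-platform with HR refresh"),
        ("case-mgmt", "Operations", 3, "decommission after data migration"),
    ]

    def rag(risk_rating: int) -> str:
        """Map a 1-5 risk rating onto the red/amber/green scale for the heatmap."""
        return "red" if risk_rating >= 4 else "amber" if risk_rating == 3 else "green"

    with open("legacy_heatmap.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["system", "owner", "risk_rating", "rag", "retirement_path"])
        for system, owner, risk, path in systems:
            writer.writerow([system, owner, risk, rag(risk), path])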

Upskilling your team

If you're building internal capability for AI project leads, data stewards, and governance teams, see role-based options here: AI courses by job. Focus training on data quality, API integration, model risk, and audit requirements specific to public services.

