AI Errors Force Deloitte Refund in Australia as UK Audit Scrutiny Intensifies

Deloitte will issue a partial refund after AI-assisted citation errors appeared in an Australian government review; a corrected version is now online. The review's findings are unchanged, and the refund amount is still pending.

Published on: Oct 07, 2025

Deloitte to issue refund after AI-linked errors in Australian government report

Australia's Department of Employment and Workplace Relations (DEWR) says Deloitte will provide a partial refund after an "independent assurance review" included incorrect citations that were produced with AI assistance. The report was commissioned for about $439,000 to assess issues with a welfare process that automatically penalized jobseekers. The refund amount remains confidential until the process is finalized. A corrected version of the review has been posted on the department's site.

Deloitte acknowledged inaccurate footnotes and references in the original document. The errors included citations to non-existent reports, including work attributed to academics at the University of Sydney and Lund University. The government says the review's core findings and recommendations are unchanged and the contract will be made public after the transaction concludes. Deloitte has not stated that AI directly caused the mistakes but confirmed it resolved the matter with the client.

Why this matters for government teams

Consultancies are increasingly using AI to draft, research, and assemble evidence. That raises the risk of "hallucinated" citations slipping into official work if verification is weak. In June, Big Four firms were criticized for poor oversight of automated tools and AI in audit quality. The lesson is simple: if AI touches the work, verification must be explicit, funded, and enforced.

What happened

  • Commission: ~$439,000 independent assurance review, ordered in December 2024.
  • Purpose: Help DEWR assess problems with a welfare system that automatically penalized jobseekers.
  • Issue: Faulty references and footnotes, some pointing to non-existent academic reports.
  • Status: Corrected review published; refund agreed in principle; amount pending finalization.

Immediate actions for public sector buyers

  • Require AI disclosure: Vendors must declare if and how AI is used (research, drafting, analysis). Prohibit undisclosed AI in deliverables.
  • Set acceptance criteria: Tie payment to citation accuracy, evidence traceability, and error thresholds. Use holdbacks to enforce quality.
  • Mandate verification steps: Independent spot-checks of 20-30% of citations before acceptance. Reject unverifiable sources.
  • Demand an evidence pack: Full bibliography with working links/DOIs, access dates, and source files. No pay without it.
  • Insist on an audit trail: Change logs, version history, and (where feasible) prompt/output logs to show who did what and when.
  • Check vendor controls: Ask for their AI governance, human-in-the-loop process, and pre-publication fact-checking workflow.
  • Protect data: Ban use of sensitive or unpublished government data in external AI tools without written approval.
  • Stage reviews: Add an early draft checkpoint focused only on sources and references to catch problems before final delivery.
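The spot-check step above can be automated in part. Below is a minimal sketch of a sampling routine that pulls a fraction of a deliverable's citations and flags entries with no resolvable DOI or URL; the record format and all names are hypothetical, and a real workflow would also attempt to retrieve each source.

```python
import random
import re

# Hypothetical citation records: (title, doi_or_url). Illustrative only.
CITATIONS = [
    ("Jobseeker compliance review", "10.1000/xyz123"),
    ("Welfare automation study", "https://example.org/report.pdf"),
    ("Phantom working paper", ""),  # no source at all -> should be flagged
]

# Loose syntactic check for a DOI (e.g. "10.1000/xyz123").
DOI_RE = re.compile(r"^10\.\d{4,9}/\S+$")


def spot_check(citations, fraction=0.25, seed=0):
    """Sample a fraction of citations (at least one) and flag any
    entry that carries neither a DOI nor a URL."""
    rng = random.Random(seed)  # seeded so the sample is reproducible
    n = max(1, round(len(citations) * fraction))
    sample = rng.sample(citations, n)
    flagged = [title for title, ref in sample
               if not (DOI_RE.match(ref) or ref.startswith("http"))]
    return sample, flagged
```

Run with `fraction=0.25` for the 20-30% spot-check the list describes, or `fraction=1.0` for full pre-acceptance verification; anything flagged goes back to the vendor before payment.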

Contract language to add now

  • Disclosure clause: Vendors must disclose all automated tools used and obtain approval for generative use.
  • Accuracy warranties: Vendors warrant that citations are valid and accessible. Include right to public correction and rework at no cost.
  • Remedies: Financial penalties and partial refunds tied to error rates, plus the right to terminate for repeated failures.
  • Records retention: Require retention and provision of working papers, research notes, and source lists for a set period.

Budget and timelines

Allocate 5-10% of project value for independent verification of sources, facts, and analysis. Build time for source validation before final sign-off. Make early, narrow reviews (citations and evidence only) part of the schedule.
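As a rough arithmetic sketch of that allocation (the contract value is taken from this story; the 5-10% band is the guidance above):

```python
def verification_budget(project_value, low=0.05, high=0.10):
    """Return the low and high ends of the 5-10% set-aside
    for independent verification of sources and analysis."""
    return project_value * low, project_value * high

lo, hi = verification_budget(439_000)
# On a ~$439,000 engagement: roughly $21,950 to $43,900 for verification.
```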

Deloitte faces UK scrutiny as well

Separately, the UK's Financial Reporting Council (FRC) is investigating audits of Stenn, a fintech that collapsed into administration in December 2024. U.S. authorities have linked the company to a money laundering case involving its founder, Greg Karpovsky, who denies wrongdoing. The FRC will review audits from 2017-2023 related to Stenn Assets UK Limited and Stenn International Limited. Deloitte became Stenn's auditor in 2023, following Azets; EY resigned in 2018, citing concerns about related-party transactions and management explanations. Deloitte and Azets say they will cooperate fully with the FRC.

Build in-house capability

Don't outsource your accountability. Stand up a small internal review cell that can verify sources, sample-test vendor outputs, and enforce standards across departments. If your team needs structured upskilling, explore role-based learning paths for public sector work.

Bottom line

AI can help, but government work demands verified evidence. If you buy AI-enabled deliverables, fund verification, enforce it in the contract, and hold vendors to it with money on the line.