AI Hallucinations, Fake Citations: Deloitte's AU$440k Welfare Report Sparks Refund Push

Deloitte's welfare report used AI and included fabricated citations, prompting partial refunds and scrutiny. Agencies need disclosure, verifiable sources, and enforceable terms.

Categorized in: AI News, Government
Published on: Oct 09, 2025

AI-Created Citations in Deloitte Report Spark Accountability Questions for Government Buyers

The Australian government paid Deloitte AU$440,000 (about US$290,000) for a report on automated penalties in the welfare system. After publication on the Department of Employment and Workplace Relations website, a law professor flagged that the report was "full of fabricated references," including made-up case law and a citation to a non-existent book.

Deloitte reissued the report, acknowledging incorrect footnotes and references while stating that the recommendations were unchanged. The updated version disclosed that Azure OpenAI was used in its preparation, and Deloitte agreed to refund part of the fee, saying the matter was resolved with the client. Senator Barbara Pocock has called for a full refund, citing the report's misuse of AI and its misquotation of a judge.

Why this matters for government

Generative AI can speed up research and drafting, but it can also invent citations and misstate legal precedent. When this happens in a vendor deliverable, agencies face reputational risk, policy risk, and potential legal exposure.

This incident is a clear signal: AI use must be transparent, verifiable, and governed by enforceable contract terms. Trust is not a control. Verification is.

Red flags to watch in vendor deliverables

  • Legal citations or references that cannot be found in official databases.
  • Quotations without pinpoint citations or source links.
  • Inconsistent terminology that suggests stitched sources or auto-generated text.
  • Overconfident language with no evidence or footnotes to back claims.
  • Bibliographies with obscure publishers or titles that do not appear in catalogues.

What to do now

Minimum contract clauses

  • Mandatory disclosure: Vendors must declare any AI systems used (e.g., Azure OpenAI) and where they were used in the work product.
  • No fabricated sources: Warranties that all citations are verifiable in recognized repositories; breach triggers remediation at vendor cost.
  • Traceability: Require retention of prompts, source corpora, drafts, and change logs for audit (a minimal log sketch follows this list).
  • Human review: Certify that qualified subject-matter experts conducted line-by-line checks, especially for legal or policy sections.
  • Clawbacks and penalties: Define partial or full refunds, milestone withholds, and rework timelines for noncompliance.
  • Data security: Prohibit feeding sensitive data into AI tools that lack approved data-handling controls.
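The traceability clause is easier to enforce if vendor logs are tamper-evident, not merely retained. As a rough illustration only (the file name, event names, and record fields are all assumptions, not a mandated format), this Python sketch chains each audit record to the hash of the previous one, so an after-the-fact edit to any earlier entry breaks the chain:

```python
import hashlib
import json
import time

LOG_PATH = "ai_traceability.jsonl"  # hypothetical location

def append_record(event: str, detail: dict, prev_hash: str) -> str:
    """Append one audit record, chained to the previous record's hash."""
    record = {
        "timestamp": time.time(),
        "event": event,            # e.g. "prompt", "draft", "review"
        "detail": detail,
        "prev_hash": prev_hash,
    }
    body = json.dumps(record, sort_keys=True)
    record_hash = hashlib.sha256(body.encode("utf-8")).hexdigest()
    with open(LOG_PATH, "a", encoding="utf-8") as log:
        log.write(json.dumps({"record": record, "hash": record_hash}) + "\n")
    return record_hash

# Example: log a prompt and the draft it produced.
h = append_record("prompt", {"model": "Azure OpenAI", "text": "..."}, prev_hash="genesis")
h = append_record("draft", {"file": "report_v1.docx"}, prev_hash=h)
```

An agency could require delivery of such a log alongside the report and spot-verify the chain during acceptance.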

Verification workflow for AI-assisted reports

  • Run automated reference checks across legal and academic databases before acceptance (a minimal DOI-check sketch follows this list).
  • Sample-check quotations against source documents; reject paraphrases presented as direct quotes.
  • Require a "sources appendix" listing working links, DOIs, and medium-neutral case citations.
  • Perform a targeted legal review for any statement interpreting statute, regulation, or case law.
  • Use fact-check signoffs: Vendor SME, agency SME, and legal counsel each certify their section.
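The first step can be partially automated. Below is a minimal sketch, assuming Python and the public Crossref REST API: it checks whether each DOI in a bibliography resolves to a registered record. The bibliography entries shown are hypothetical, and an unresolved DOI is a flag for manual review, not proof of fabrication; legal citations still need separate checks against official case-law databases.

```python
import urllib.error
import urllib.parse
import urllib.request

def doi_resolves(doi: str, timeout: float = 10.0) -> bool:
    """Return True if Crossref has a record for this DOI."""
    url = "https://api.crossref.org/works/" + urllib.parse.quote(doi)
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except urllib.error.HTTPError:
        return False   # 404: Crossref has no record of this DOI
    except urllib.error.URLError:
        raise          # network failure: retry rather than flag

# Hypothetical bibliography extracted from a deliverable.
bibliography = {
    "Smith (2021), Journal of Public Policy": "10.1000/example-doi",
}

for label, doi in bibliography.items():
    if not doi_resolves(doi):
        print(f"FLAG for manual review: {label} ({doi})")
```

A flagged reference should route to a human checker; fabricated citations often mimic real journals, so matching the title and authors against the returned metadata is a sensible second pass.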

Oversight and accountability

  • Adopt an internal policy that mirrors the Commonwealth Procurement Rules and your risk framework for AI-assisted deliverables.
  • Establish an AI use register for vendor and internal projects, noting models, data sources, and reviewers (an illustrative entry is sketched after this list).
  • Train contract managers and policy teams to spot AI failure modes: hallucinated citations, plausible-sounding but false legal claims, and template-driven narrative errors.
  • Set thresholds for when external legal counsel must review AI-assisted legal or policy analysis.
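To make the register concrete, here is one illustrative entry sketched as a Python data structure; every field name is an assumption about what an agency might track, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass
class AIUseRegisterEntry:
    """One row in an agency AI use register (illustrative fields only)."""
    project: str
    vendor: str                   # or "internal"
    models: list[str]             # e.g. ["Azure OpenAI"]
    data_sources: list[str]       # corpora or datasets supplied to the model
    sections_affected: list[str]
    reviewers: list[str]          # named SMEs and counsel who signed off
    verification_done: bool = False

entry = AIUseRegisterEntry(
    project="Automated penalties report",
    vendor="ExampleConsulting",   # hypothetical vendor
    models=["Azure OpenAI"],
    data_sources=["Approved agency case-file subset"],
    sections_affected=["Legal analysis", "Bibliography"],
    reviewers=["Agency SME", "External counsel"],
)
```

Whatever the format, the useful property is that models, data sources, and named reviewers are recorded per deliverable, so accountability questions have a single place to start.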

What government buyers should ask vendors

  • Which AI tools were used, and for which sections?
  • What retrieval or grounding methods were applied to prevent invented citations?
  • Who reviewed legal and policy claims, and what sources were checked?
  • Can you provide the prompt logs, drafts, and a reference validation report?
  • What steps will you take, and at whose cost, if issues are found after delivery?

Bottom line

AI can assist, but it cannot replace expert judgment or verifiable evidence. Require disclosure, build verification into contracts, and make vendors prove their sources. If a report shapes policy or informs the law, treat every citation as a control point, not an assumption.

Need to upskill teams on safe, effective AI use? Explore practical courses by role at Complete AI Training.