Deloitte to refund part of $440,000 fee after AI-driven errors in government report
Deloitte Australia is returning part of its $440,000 fee after the firm delivered an AI-assisted report to the Department of Employment and Workplace Relations (DEWR) that contained multiple inaccuracies.
The report on the IT system behind the Targeted Compliance Framework included non-existent academic references and a fabricated quote attributed to a Federal Court judgment in Amato v Commonwealth (2019). Academics flagged the errors soon after the report's initial release in July, pointing to AI "hallucinations" as a likely cause.
What happened
DEWR commissioned an independent assurance review of the system that applies to people required to meet mutual obligations. After concerns surfaced, Deloitte investigated and issued a corrected report. The department posted the revised version on 3 October 2025 and noted that the original, dated 4 July 2025, contained citation and case summary errors.
The update states that references were corrected, the Amato case summary was amended, and clarity improvements were made. Deloitte has not commented publicly. DEWR has confirmed the firm will provide a partial refund.
Why this matters for public officials
AI can draft quickly, but it can also invent sources and misquote legal decisions. When such output slips into official work, it risks legal exposure, sows policy confusion, and erodes public trust.
Even when a report's "findings" remain unchanged, citation errors can undermine its credibility and your ability to rely on it in decision-making or litigation.
Immediate steps for agencies commissioning AI-assisted work
- Require vendors to disclose where and how AI tools are used in analysis, drafting, and evidence gathering.
- Mandate human expert verification of all legal citations, case law summaries, statistics, and quoted sources before submission.
- Run a citation audit: sample-check references, follow links, and confirm page numbers and quotes match the originals (a scripted first pass is sketched after this list).
- Insist on full source transparency: require vendors to provide access to the datasets, case extracts, and academic papers relied upon.
- Add internal checks: legal, policy, and technical reviewers sign off on separate sections with clear accountability.
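Agencies that want to operationalise the citation audit above can script the first pass. The sketch below is illustrative only: the sample references, field names, and helper function are assumptions, not part of any DEWR or Deloitte process, and a resolving link says nothing about whether quoted text matches the source, so human verification remains the final gate.

```python
# Minimal citation-audit sketch: confirm that cited DOIs/URLs resolve at all.
# The sample entries are invented placeholders, not citations from the report.
import urllib.request
import urllib.error

REFERENCES = [
    {"citation": "Example et al. (2020), Journal of Illustrative Studies",
     "url": "https://doi.org/10.1000/example"},          # hypothetical DOI
    {"citation": "Sample v Commonwealth [2019] FCA 0000",
     "url": "https://www.judgments.fedcourt.gov.au/"},   # portal, not a judgment
]

def link_resolves(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL answers a HEAD request with a non-error status."""
    req = urllib.request.Request(
        url, method="HEAD", headers={"User-Agent": "citation-audit/0.1"}
    )
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except (urllib.error.URLError, TimeoutError):
        return False

for ref in REFERENCES:
    verdict = "resolves" if link_resolves(ref["url"]) else "FLAG: dead or fabricated?"
    print(f'{ref["citation"]}: {verdict}')
    # A live link is necessary, not sufficient: a reviewer must still open the
    # source and confirm page numbers and quoted passages match the original.
```

A pass here only narrows the manual workload: anything flagged goes straight to a reviewer, and a sample of the "resolves" entries should still be spot-checked by hand.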
Procurement and contract implications
- Embed AI-use clauses in contracts: disclosure, data handling, human oversight, and responsibility for inaccuracies.
- Tie payments to quality gates: acceptance only after citation and fact verification checks pass.
- Include correction and refund provisions for errors discovered post-publication, plus audit rights.
- Apply performance history: weigh prior conduct and quality issues in future evaluations, per the Commonwealth Procurement Rules (CPRs).
Legal and policy context
The disputed content included a misrepresentation of the Federal Court's Amato decision, a key case in compliance and automation debates. If your work references Amato, verify against the primary judgment to avoid secondary-source drift (Amato v Commonwealth [2019] FCA 1133).
DEWR has stated that reviews are underway to ensure decisions are lawful and sound. The department says the updated report does not change the substantive findings or recommendations, and it continues to work through those recommendations.
Political and sector fallout
The Greens have called for Deloitte to repay the full $440,000 and urged the government to bar firms from future work if they act unethically or deliver poor-quality output. The push follows the PwC confidentiality scandal, after which the government continued to allow the firm to bid for public contracts.
Expect stronger scrutiny of consultancy outputs, tighter procurement terms, and more explicit AI governance requirements in future engagements.
What you can do this week
- Review current consultancy contracts for AI disclosure, verification steps, and refund/correction clauses.
- Set a standing requirement: every legal or academic reference in external reports must be validated against the primary source.
- Brief your evaluation panels to probe vendors on AI guardrails, human review, and evidence traceability.
- Upskill internal reviewers on spotting AI artifacts (fabricated citations, vague sourcing, inconsistent terminology); a minimal screening sketch follows this list. If you need structured options, see AI courses by job role.
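For the reviewer-upskilling point above, a crude automated screen can surface citation-like strings for human checking. This is a minimal sketch under stated assumptions: it presumes an agency-maintained register of already-verified sources, the regex only handles single-word party names, and the Nguyen citation in the sample text is an invented placeholder, not a real case.

```python
# Rough screening sketch: extract case-citation-like strings from a draft and
# flag any that are absent from a register of independently verified sources.
import re

# Register populated by human reviewers who have read the primary judgments.
VERIFIED = {"Amato v Commonwealth [2019] FCA 1133"}

draft = """The framework was considered in Amato v Commonwealth [2019] FCA 1133
and in Nguyen v Secretary [2021] FCA 9999 (an invented citation for this demo)."""

# Medium-neutral citation shape, e.g. "Name v Name [2019] FCA 1133".
# Deliberately narrow: single-word party names only; widen as needed.
case_pattern = re.compile(
    r"\b[A-Z][A-Za-z'-]+\s+v\s+[A-Z][A-Za-z'-]+\s+\[\d{4}\]\s+[A-Z]+\s+\d+"
)

for citation in case_pattern.findall(draft):
    verdict = "verified" if citation in VERIFIED else "FLAG for manual check"
    print(f"{citation}: {verdict}")
```

A screen like this cannot prove a citation is genuine; it only routes unfamiliar references to a human who checks them against the primary source.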
Bottom line
AI can speed up drafting, but it cannot replace accountable expertise. Government teams must demand source transparency, enforce verification, and link payment to quality. Vendors who use AI without safeguards take on clear risk, and they should bear the cost when errors land on the public record.