357 AI tools, patchy policies: NSW auditor urges agency-specific rules and central inventories

NSW uses hundreds of AI tools, but statewide rules aren't enough. The auditor urges agency-specific policies, live inventories, risk checks, and human oversight.

Published on: Nov 03, 2025

NSW auditor: Extra agency-led AI policies needed

Artificial intelligence is now widely used across NSW Government. A recent audit found 357 different tools in play across 21 of the state's 26 largest departments and agencies.

Use cases span workflow support, customer interaction, fraud detection, cyber security, compliance monitoring, capability development, and service delivery. Not all tools are live; some are still in pilot.

Whole-of-government rules aren't enough

The audit warns that statewide principles do not go deep enough for agencies moving into advanced or higher-risk use. In the report's words, an agency-level policy is required, and there is no one-size-fits-all model.

Fewer than half of the largest agencies had a formal AI policy or had embedded AI into existing governance frameworks. Several said their policies were under review, while others relied solely on the statewide AI ethics principles.

Inventories are critical for accountability

The report encourages central registers that document the purpose, use, and limits of each AI tool. Fifteen of the 21 largest departments and agencies said they were already doing this at the time of review.
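In practice, such a register can start as a simple structured record per tool. The sketch below is illustrative only: the field names, the `AIToolRecord` class, and the example entry are assumptions, not anything prescribed by the audit.

```python
from dataclasses import dataclass, asdict

@dataclass
class AIToolRecord:
    """One row in a hypothetical central AI register (all fields illustrative)."""
    name: str
    purpose: str             # what the tool is for
    owner: str               # accountable product owner
    data_sources: list       # datasets the tool reads
    status: str              # "pilot" or "production"
    risk_rating: str         # e.g. "low" / "medium" / "high"
    known_limits: str        # documented limitations

# Hypothetical example entry
record = AIToolRecord(
    name="fraud-triage-assistant",
    purpose="Flag transactions for manual fraud review",
    owner="Fraud Analytics Team",
    data_sources=["payments_db"],
    status="pilot",
    risk_rating="high",
    known_limits="Not validated on low-volume merchant categories",
)

# The register itself: a list of plain dicts, easy to export or publish
register = [asdict(record)]
print(register[0]["status"])  # -> pilot
```

Keeping entries as plain data (rather than free text) makes it straightforward to report pilot/production status and risk ratings across the whole portfolio.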

What departments should do next

  • Publish an agency AI policy that sets acceptable use, risk tiers, and approval pathways.
  • Maintain a single inventory: purpose, owners, data sources, model type, supplier, pilot/production status, risk rating, review dates, and known limitations.
  • Require pre-implementation impact assessments covering privacy, security, fairness, accessibility, and legal basis.
  • Keep a human in the loop where rights, entitlements, or enforcement decisions are affected. Provide an override.
  • Test and monitor models: accuracy, drift, bias. Define thresholds, alerts, and escalation routes.
  • Tighten data controls: minimisation, audit logs, retention, red-teaming for prompt/output risks.
  • Lock in procurement clauses: training data origin, IP and confidentiality, incident reporting, SLAs, right to audit, decommissioning obligations.
  • Control "shadow AI": approved tool catalogues, rules for generative features in office suites, and clear data leakage guidance.
  • Assign accountability: named product owner, risk owner, and senior executive sponsor.
  • Train staff with practical do/don't guidance and agency-specific examples.
  • Plan for failure: rollback paths, kill switch, incident response, citizen complaint and redress flows.
  • Report publicly where appropriate: publish your inventory and risk ratings to build trust.
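The "test and monitor" item above implies concrete thresholds and escalation routes. Here is a minimal sketch of that idea; the metric names, threshold values, and escalation messages are all assumptions for illustration, not audit requirements.

```python
# Illustrative monitoring check: compare reported metrics against agreed thresholds.
THRESHOLDS = {
    "accuracy_min": 0.90,     # alert if accuracy drops below this
    "drift_score_max": 0.10,  # cap on a data-drift score
    "bias_gap_max": 0.05,     # max allowed outcome gap between cohorts
}

def check_metrics(metrics: dict) -> list:
    """Return alert messages for any breached threshold."""
    alerts = []
    if metrics["accuracy"] < THRESHOLDS["accuracy_min"]:
        alerts.append("accuracy below threshold: escalate to risk owner")
    if metrics["drift_score"] > THRESHOLDS["drift_score_max"]:
        alerts.append("data drift detected: schedule model review")
    if metrics["bias_gap"] > THRESHOLDS["bias_gap_max"]:
        alerts.append("cohort outcome gap exceeds limit: pause automated decisions")
    return alerts

alerts = check_metrics({"accuracy": 0.87, "drift_score": 0.04, "bias_gap": 0.02})
print(alerts)  # only the accuracy alert fires
```

The point is that each threshold has a named owner and a defined next step, so a breach triggers a documented route rather than an ad hoc response.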

Why this matters

AI is no longer a side project. With hundreds of tools in use, decisions made by or with AI need clear guardrails, documented purpose, and visible ownership. Agency-specific governance closes the gap between high-level ethics and day-to-day operations.

Resources

Read more from the Audit Office of New South Wales and review NSW's digital policy settings via Digital NSW.

Skill up your team

If you're building an agency policy or rolling out training at scale, explore practical AI courses by role at Complete AI Training.

