AI use outpaces governance in NSW councils, sparking call for mandatory rules

NSW's audit office wants OLG to set a mandatory AI framework before tools take hold. Without shared rules, practice splinters, accountability blurs, and residents carry the risk.

Categorized in: AI News, Government
Published on: Mar 02, 2026

NSW councils need mandatory AI governance before tools take hold

The Audit Office of New South Wales has called for the Office of Local Government (OLG) to set a mandatory framework for responsible AI use across councils. Unlike federal and state agencies, councils aren't bound to follow existing AI guidelines. That gap puts ethical standards, institutional integrity, and resident rights at risk.

The message is clear: build the guardrails before scaling the tools. Without shared rules, councils will face inconsistent practice, unclear accountability, and higher downside risk.

Why a mandatory framework matters

Federal and NSW AI frameworks exist, but councils don't have to use them. A minimum mandatory baseline from the OLG, drawing on the NSW AI Assurance Framework and Australia's AI Ethics Principles, would create consistency across the sector.

That baseline would set expectations for risk assessment, accountability, transparency, privacy, and security. It would also give councils a single reference point for procurement and oversight.

Where councils stand right now

Only 40% of NSW councils have a formal AI policy. Just 11% have an adoption strategy. Ninety councils reported using 109 AI tools (pilot or fully implemented), but the true number is likely higher because fewer than half track their AI tools.

Most reported use is generative AI for productivity and workflow. Other uses include:

  • Resident and developer support
  • Recruitment assistance
  • Cybersecurity support
  • Asset quality and maintenance assessments
  • Waste contamination identification

Adoption is growing: 51% of councils have AI tools in planning or development.

The oversight gap

Only 10% of councils have a central inventory of AI tools. As more AI arrives through pilots and procured software, visibility drops. Without a live register, executives can't confirm the right controls are in place, or prove it to the community.

The consequence: fragmented efforts, missed benefits, and unmanaged risk (strategic and operational).

What councils can do in the next 90 days

  • Assign ownership: name an accountable executive and an AI lead with authority to approve or stop deployments.
  • Publish an interim AI policy: set principles, risk thresholds, approval gates, and clear lines of accountability.
  • Stand up a central inventory: record every AI use (including features inside procured systems), owners, purpose, data used, risk rating, and status.
  • Require pre-implementation checks: privacy impact assessment, security review, legal review, data provenance, and basic bias/accuracy testing appropriate to risk.
  • Update procurement: include requirements for transparency, audit rights, model/version disclosures, incident notification, data handling, and exit/portability.
  • Set usage rules: staff guidelines for prompts, sensitive data handling, records management, and acceptable use.
  • Track operations: log incidents, model changes, and significant performance shifts; define escalation paths.
  • Pilot with intent: define success metrics, resident impact, and kill-switch criteria before pilots start.
  • Close the loop with residents: publish high-level use cases, provide contact points for complaints, and outline redress options.
  • Integrate with existing governance: link AI risk to enterprise risk, cybersecurity, privacy, and audit plans.
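As an illustrative sketch of the inventory and pre-implementation checks above (the schema, field names, and entries here are assumptions for demonstration, not an official OLG format), a minimal central register could be modeled like this:

```python
from dataclasses import dataclass
from enum import Enum

class RiskRating(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

class Status(Enum):
    PLANNED = "planned"
    PILOT = "pilot"
    LIVE = "live"
    RETIRED = "retired"

@dataclass
class AIUseCase:
    """One row in a council's central AI inventory (hypothetical schema)."""
    name: str
    owner: str                      # accountable executive or AI lead
    purpose: str
    data_used: str
    risk_rating: RiskRating
    status: Status
    privacy_assessed: bool = False  # pre-implementation checks
    security_reviewed: bool = False

# Example register entries (illustrative only)
register = [
    AIUseCase("Chatbot for resident enquiries", "Director, Customer Service",
              "Answer common service questions", "Public FAQ content",
              RiskRating.LOW, Status.PILOT,
              privacy_assessed=True, security_reviewed=True),
    AIUseCase("CV screening assistant", "Director, People & Culture",
              "Shortlist job applicants", "Applicant CVs (personal data)",
              RiskRating.HIGH, Status.PLANNED),
]

# Flag anything piloting or live without completed checks
gaps = [u.name for u in register
        if u.status in (Status.PILOT, Status.LIVE)
        and not (u.privacy_assessed and u.security_reviewed)]
print(gaps)  # -> []
```

Even a spreadsheet with these columns would do; the point is that every AI use, including features inside procured systems, gets a row with an owner, a risk rating, and a check status before go-live.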

What the OLG can do now

  • Set minimum, mandatory controls: policy, inventory, risk assessment, procurement standards, transparency, and auditability.
  • Scale by risk: lighter touch for low-risk productivity tools; tighter controls for decisions affecting services, rights, or safety.
  • Provide shared resources: templates, model clauses, risk matrices, and a simple reporting format to cut duplication and cost.
  • Coordinate capability: shared training, sample test datasets, and a community of practice for councils.

Budget pressure is real

Local Government NSW warns that a mandated framework could create financial barriers for smaller councils. A principles-based core with tiered requirements can help. Shared templates, central guidance, and training support can reduce cost and lift consistency.

For teams building skills while policy matures, see AI for Government for practical training and governance resources.

Metrics that keep you honest

  • Coverage: percentage of AI use cases recorded in the inventory (target: 100%).
  • Compliance: percentage of use cases with completed risk and privacy assessments before go-live.
  • Quality: number of incidents, complaints, and significant model drifts per quarter, and time to resolution.
  • Value: measured hours saved or service-level improvement for each approved use case.
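The coverage and compliance figures above fall straight out of a central register. A minimal sketch (field names and example data are assumptions, not a mandated reporting format):

```python
# Hypothetical register entries: known AI uses, some recorded, some not.
use_cases = [
    {"name": "Resident chatbot", "in_inventory": True,
     "risk_assessed": True, "privacy_assessed": True, "live": True},
    {"name": "CV screening", "in_inventory": True,
     "risk_assessed": False, "privacy_assessed": False, "live": False},
    {"name": "Shadow-IT summariser", "in_inventory": False,
     "risk_assessed": False, "privacy_assessed": False, "live": True},
]

def pct(part: int, whole: int) -> float:
    """Percentage rounded to one decimal place; 0.0 when there is no data."""
    return round(100 * part / whole, 1) if whole else 0.0

# Coverage: share of known AI uses recorded in the inventory (target: 100%)
coverage = pct(sum(u["in_inventory"] for u in use_cases), len(use_cases))

# Compliance: share of live uses with risk and privacy assessments done
live = [u for u in use_cases if u["live"]]
compliance = pct(sum(u["risk_assessed"] and u["privacy_assessed"] for u in live),
                 len(live))

print(f"Coverage: {coverage}%")      # 2 of 3 known uses recorded -> 66.7
print(f"Compliance: {compliance}%")  # 1 of 2 live uses fully assessed -> 50.0
```

Numbers like these are only honest if the inventory itself is complete, which is why coverage is the first metric to chase.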

Bottom line

Governance first, tools second. With a clear baseline, councils can move faster, reduce risk, and protect residents, without creating a two-speed sector where only the well-resourced can proceed.
