Australians turn to AI and peers as trust in digital services lags
Australians are deeply connected online, yet many avoid official channels for key life moments. The 2025 Digital Citizens Report: Bridging the AI Gap from Publicis Sapient shows a clear split between daily digital habits and how citizens engage with government platforms.
The signal for government teams is direct: experience, awareness, and trust are the blockers. AI use is rising fast, and people expect the same speed and relevance from public services that they get elsewhere.
Where citizens go first
Only 34% start with government websites for major life events like having a baby, enrolling in school, or retiring. Instead, 55% ask friends and family, and 42% use Google. Just 18% use a government service via app or phone.
32% say government services are not top of mind. Many in vulnerable groups are active online but still find public platforms hard to use.
Publicis Sapient's Steven Metzmacher said citizen behaviour is moving faster than government technology. He urged a shift from static pages to structured, machine-readable platforms that AI systems can interpret, so people and tools can find answers quickly.
AI is already part of the journey
51% of Australians use generative AI daily, most commonly for image creation (24%), news and current events (22%), and education (21%). 21% already use generative AI to find information about government services, and among AI users the top use case is Q&A and information seeking (42%).
Metzmacher also highlighted AI-driven personalisation to reduce friction and meet people where they are. The key levers: better experience, stronger awareness, and trust built through clear, helpful communication.
What people expect from digital government
57% say personalisation would increase usage. 67% want a single entry point to access services across government. Among those using in-person channels, 60% would prefer simple online access if it existed.
Data concerns are still high. Only 24% are fully comfortable sharing data across agencies, though 37% would do so if it improved their experience. 89% want transparency in AI use, with 45% calling for public access to source code. Just 11% say they fully trust AI in government contexts, and 49% want clear regulations and visible safeguards.
Risks on the table
Daily use of generative AI rose from 40% in 2024 to 51% in 2025. The biggest concerns: privacy breaches, misinformation, and scams (45%). Citizens support the use of AI, but they want ethical guardrails, transparency, and explicit consent.
Policy context
Angela Robinson of Publicis Sapient Australia noted that better digital services come down to access: support that fits how people live and work now. At the Economic Reform Roundtable, an AI plan for the Australian Public Service was named a top priority for long-term growth.
Her message: people already rely on AI to complete everyday tasks, including finding government information. Public platforms must be easier to find, simpler to use, and built around the tools people already use.
What government teams can do next
- Make content machine-readable: Use structured data (e.g., schema.org markup), clean information architecture, and open APIs so both people and AI systems can retrieve the right answers fast (see the JSON-LD sketch after this list).
- Build AI-ready search and answers: Provide concise, source-linked responses; publish service definitions and FAQs in consistent formats; expose knowledge via endpoints that assistants can query (an endpoint sketch follows this list).
- Design for life events: Build clear journeys for major moments (birth, school, retirement). Reduce steps, show eligibility early, and remove redundant forms.
- Personalise with consent: Offer opt-in profiles, save progress, and use event triggers. Make preference and data controls visible at every step.
- Create a single front door: Implement federated search across agencies, unified sign-in, and consistent UI patterns so people don't have to relearn each site (a federated search sketch follows this list).
- Build trust into the interface: Disclose where AI is used, show reasoning or sources, and provide non-AI paths. Publish model, data, and safeguard details in plain language.
- Privacy by design: Minimise data collection, explain why data is needed, and offer clear choices. Log access, allow audit, and align with the Australian Privacy Principles (OAIC guidance).
- Raise awareness where people already search: Improve SEO for life events and publish structured snippets. Ensure content is easy for AI systems to reference accurately.
- Test with vulnerable groups: Run moderated sessions, measure task success and time-to-answer, and fix blockers quickly. Track completion rates rather than impressions.
- Upskill the workforce: Train staff on prompt literacy, AI-assisted writing, and model governance. For practical learning paths by role, see Complete AI Training.
- Meet the standard: Align services to the Digital Service Standard and publish conformance status publicly.
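To illustrate the structured data point above, here is a minimal sketch of schema.org JSON-LD for a government service page, written in Python so it could be generated from a content store. The service name, provider, and URL are placeholders, not real agency data.

```python
import json

# Minimal sketch: emit schema.org JSON-LD for a government service page so
# search engines and AI assistants can parse it. All details below are
# illustrative placeholders, not a real agency's data.
service_jsonld = {
    "@context": "https://schema.org",
    "@type": "GovernmentService",
    "name": "Enrol a child in school",
    "serviceType": "Education enrolment",
    "provider": {
        "@type": "GovernmentOrganization",
        "name": "Example State Department of Education",  # placeholder
    },
    "availableChannel": {
        "@type": "ServiceChannel",
        "serviceUrl": "https://example.gov.au/enrol-school",  # placeholder URL
    },
    "audience": {"@type": "Audience", "audienceType": "Parents and guardians"},
}

# Embed the output in the page inside a <script type="application/ld+json"> tag.
print(json.dumps(service_jsonld, indent=2))
```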
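For the AI-ready answers point, the sketch below assumes a small Flask service that exposes concise, source-linked answers in a consistent shape an assistant can query and cite. The route, field names, and content are illustrative, not a published government API.

```python
from flask import Flask, jsonify

app = Flask(__name__)

# Illustrative knowledge entries; in practice these would come from a CMS or
# structured content store, not a hard-coded dict.
ANSWERS = {
    "enrol-school": {
        "question": "How do I enrol my child in school?",
        "answer": "Apply through your state education department; bring proof of age and address.",
        "sources": ["https://example.gov.au/enrol-school"],  # placeholder URL
        "last_reviewed": "2025-04-01",
    },
}

@app.get("/v1/answers/<topic>")
def get_answer(topic):
    """Return a concise, source-linked answer that an assistant can cite."""
    entry = ANSWERS.get(topic)
    if entry is None:
        return jsonify({"error": "unknown topic"}), 404
    return jsonify(entry)

if __name__ == "__main__":
    app.run(port=8000)
```

Publishing the same response shape across agencies is also what makes a single-front-door search (next sketch) straightforward to merge.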
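For the single front door, this sketch fans a query out to several agency search endpoints in parallel and merges the results into one list. The agency names and URLs are placeholders, and the per-agency call is stubbed rather than a real HTTP request.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-agency search endpoints; names and URLs are placeholders.
AGENCY_ENDPOINTS = {
    "health": "https://health.example.gov.au/api/search",
    "education": "https://education.example.gov.au/api/search",
    "tax": "https://tax.example.gov.au/api/search",
}

def search_agency(name, url, query):
    """Stub for one agency's search call.

    A real implementation would issue an HTTP request to `url` and map the
    agency's response into this shared result shape.
    """
    return [{"agency": name, "title": f"{query} ({name})", "url": url}]

def federated_search(query):
    """Fan the query out to every agency in parallel and merge the results."""
    with ThreadPoolExecutor(max_workers=len(AGENCY_ENDPOINTS)) as pool:
        futures = [
            pool.submit(search_agency, name, url, query)
            for name, url in AGENCY_ENDPOINTS.items()
        ]
        results = []
        for future in futures:
            results.extend(future.result())
    return results

if __name__ == "__main__":
    for hit in federated_search("enrol in school"):
        print(hit["agency"], "-", hit["title"])
```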
About the research
Findings are based on an online survey conducted in April 2025 with 5,250 participants across Australia, representative of the national demographic profile.