AI, trust and the broker's next advice opportunity
SMEs are leaning into AI with interest and caution. The data says they want speed and risk insight from machines - and judgment, explanation and accountability from people. That's the opening for brokers across Australia and New Zealand.
The adoption gap is real
About a third of SMEs are exploring AI, but only 8% are actually implementing it. Adoption is much lower for micro businesses than for larger firms, signaling a gap in confidence, capability and risk awareness.
Across the Tasman, a 2024 survey found 68% of New Zealand SMEs had no plans to evaluate or invest in AI. Clients are curious but early in the journey; they need help deciding where AI fits and where it doesn't. See the latest insights in the Vero SME Insurance Index.
Where AI helps - and where clients still want people
Most businesses are fine with AI triaging information, spotting patterns, surfacing risk signals and cutting admin friction. Resistance grows as AI moves closer to the client: chatbots answering policy, renewal or claims questions raise eyebrows, and skepticism spikes if AI is framed as deciding claim outcomes.
The takeaway is simple: automation is a tool, not a substitute for accountability. As one executive put it, "People make our industry what it is and we would never want to remove the human touch from those complex areas or from the areas where the relationship comes to the fore." If you're advising on front-line automation, this overview of AI for Customer Support can help set sensible guardrails.
Transparency builds trust
"There needs to be complete transparency around where we are using it and how we are articulating that to customers," said Josh Hamill. That's both ethical and commercial. If clients can't tell where automation begins and human judgment ends, trust erodes fast.
Richard Klipin puts it plainly: "The broker is the fiduciary, the trusted partner, and you can't outsource that." Keep the line clear: AI can assist; people remain responsible for decisions - especially on coverage interpretation, placement strategy and claims outcomes.
The bigger risk conversation
Client concerns about AI aren't just about efficiency. They cut across accuracy, data security, bias, IP, governance, reputation, business continuity and professional risk. That moves the discussion beyond tools and into how AI changes a client's risk profile - and how cover, controls and contracts should respond.
What to advise - practical moves for brokers and clients
- Start with low-risk wins: Use AI for document processing, pattern-spotting, risk alerts and admin triage. Track both time saved and error rates.
- Set human-in-the-loop rules: People approve coverage placement, wording interpretation and all claims decisions. Define escalation thresholds and exceptions in writing.
- Be explicit with clients: Disclose where AI is used, what data it touches, and who is accountable. Offer opt-outs for sensitive interactions. Log AI use in the file.
- Vet vendors like you'd vet a critical supplier: Check data residency, encryption, SOC 2/ISO 27001, IP terms, indemnities, uptime SLAs and audit rights.
- Tighten controls: Role-based access, PII redaction, prompt filters, sandboxing, output checks, and kill-switches for bad outputs. Monitor and retrain where needed.
- Rehearse incidents: Model error, data leak, bias complaint, service outage. Define first-24-hour actions, evidence capture and claims notification triggers.
- Revisit cover: Cyber (privacy breach, system failure), tech E&O/PI for algorithmic errors, crime/social engineering, media/IP, and business interruption from AI outages. Watch for exclusions tied to automated decisions.
- Stand up governance: A simple AI policy, approval gates for new use cases, staff training, bias testing and explainability for any client-facing automation.
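Several of the controls above - PII redaction, human-in-the-loop escalation for claims and coverage questions, and logging AI use in the file - can be sketched in a few lines of code. The following Python example is purely illustrative: all names, patterns and keywords are hypothetical assumptions, and real redaction and routing would need vetted tooling, not a handful of regexes.

```python
import re

# Patterns for common PII (illustrative only; real redaction needs vetted tooling)
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "phone": re.compile(r"\b(?:\+?61|0)\d(?:[ -]?\d){8}\b"),  # rough AU-style format
}

# Topics that must always be escalated to a person (human-in-the-loop rule)
ESCALATION_KEYWORDS = {"claim", "coverage", "placement", "payout"}

audit_log = []  # in practice, an append-only record kept on the client file


def redact(text: str) -> str:
    """Strip likely PII before any text reaches an AI tool."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text


def route(query: str) -> str:
    """Send low-risk queries to automation; escalate high-stakes ones to a broker."""
    cleaned = redact(query)
    needs_human = any(k in cleaned.lower() for k in ESCALATION_KEYWORDS)
    decision = "escalate_to_broker" if needs_human else "ai_assist"
    audit_log.append({"query": cleaned, "decision": decision})  # log AI use in the file
    return decision
```

A query like "Can you explain my claim payout?" trips the escalation keywords and is routed to a person, while "What are your office hours?" can go to automation - with the redacted query and routing decision logged either way.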
Why this matters for growth
AI will speed the back office and sharpen risk discovery. The firms that win will pair that speed with clear judgment, clean disclosures and visible accountability.
Own the high-stakes calls. Show clients how AI makes their operation safer - and where a human will always pick up the pen. For deeper, practical tools and courses, explore AI for Insurance.