Can you rely on AI for financial advice?
AI is now stitched into daily workflows across research, reporting, and client comms. But as a source of financial advice, it still falls short.
Recent UK testing shows the gap between slick answers and sound guidance. For finance pros, that gap means compliance risk, client risk, and brand risk.
What the latest tests showed
Consumer group Which? tested six tools - ChatGPT, Google Gemini, Gemini AI Overview, Microsoft Copilot, Meta AI and Perplexity - across 40 questions in September 2025. They found inaccurate, unclear, and risky responses that could cost users if followed.
One simple trap: testers asked how to invest £25,000 in an ISA. ChatGPT and Copilot didn't flag that the annual allowance is £20,000, effectively enabling an oversubscription that breaches HMRC rules. In another test on tax codes and refunds, some tools surfaced premium refund services alongside the free government route.
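The oversubscription failure is a reminder that deterministic rules shouldn't be delegated to a language model at all. A minimal sketch of the check the tools skipped (the £20,000 figure is the annual ISA allowance at the time of writing; confirm the current limit on gov.uk before relying on it):

```python
# Deterministic check an AI answer failed to apply.
# ISA_ALLOWANCE is the annual limit at the time of writing;
# verify the current figure on gov.uk before use.
ISA_ALLOWANCE = 20_000

def check_isa_subscription(amount: float) -> str:
    """Flag a proposed ISA subscription above the annual allowance."""
    if amount > ISA_ALLOWANCE:
        excess = amount - ISA_ALLOWANCE
        return (f"Over the £{ISA_ALLOWANCE:,} annual allowance by "
                f"£{excess:,.0f} - would breach HMRC rules.")
    return f"Within the £{ISA_ALLOWANCE:,} annual allowance."

print(check_isa_subscription(25_000))
```

A five-line rule engine gets this right every time; a general-purpose model only gets it right sometimes, which is exactly the compliance problem.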
Separate research by Investing Insiders ran 100 questions across savings, housing, and retirement. Results: 56% correct, 27% misleading, 17% wrong - meaning 44% of answers weren't useful for people seeking advice.
Why this matters for finance professionals
AI tools aren't regulated by the FCA. If a client follows an AI-generated suggestion, there's no accountability, no suitability assessment, and no recourse.
As David Horowitz put it: using a general AI as your adviser is like Googling symptoms and skipping the doctor. It doesn't know goals, tax position, time horizon, or risk tolerance - and it can't take responsibility.
The other side: user error is real
Some failures aren't model limits; they're prompting and tool choice. As AI strategist Mitali Deypurkaystha argues, asking a general model to act like a specialist adviser is the GP-vs-surgeon mistake.
The fix isn't "don't use AI." It's: use it the right way, verify the source, and keep a human in the loop - especially for decisions with tax or regulatory consequences.
Where AI can help right now
- Research acceleration: pull current rules and summarize from authoritative sources (HMRC, FCA, legislation).
- Scenario drafting: create what-if comparisons from inputs you supply (contribution levels, tax year, age, marginal rate) - then validate.
- Client communication: first drafts for emails, summaries, FAQs; you edit for accuracy and tone.
- Data extraction: parse PDFs and policy docs to surface key figures for manual review.
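The scenario-drafting point above works best when the arithmetic lives in code you control and the AI only drafts the narrative around it. A hypothetical sketch of such a what-if comparison; the relief rates (20/40/45%) reflect current UK income tax bands at the time of writing and must be verified against gov.uk before any client use:

```python
# Sketch: a what-if comparison you might ask AI to draft prose around,
# while the numbers come from code you validate yourself.
# Rates are assumed current UK marginal bands - verify before use.
MARGINAL_RATES = {"basic": 0.20, "higher": 0.40, "additional": 0.45}

def net_cost_of_contribution(gross: float, band: str) -> float:
    """Net cost of a gross pension contribution after tax relief
    at the saver's marginal rate."""
    return gross * (1 - MARGINAL_RATES[band])

for band in MARGINAL_RATES:
    cost = net_cost_of_contribution(10_000, band)
    print(f"{band:>10}: £10,000 gross costs £{cost:,.0f} net")
```

Keeping the calculation deterministic means the only thing left to review in the AI output is the wording, not the figures.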
Where AI should not be the decider
- Suitability, personal recommendations, or anything that constitutes regulated advice.
- Tax-sensitive moves: pension transfers, allowance planning, CGT optimizations, ISA subscription decisions.
- Edge cases: multiple income sources, non-dom issues, trust structures, relief interactions.
- Facts that change often: allowances, thresholds, codes, and eligibility criteria.
Practical guardrails for safe use
- Force citations from specific domains (e.g., gov.uk, legislation.gov.uk, fca.org.uk) and click through to confirm.
- State the tax year and jurisdiction in every prompt. Ambiguity invites hallucination.
- Give precise inputs: income bands, ages, contribution levels, time horizon, constraints.
- Add a constraint: "If unsure or data is missing, say you don't know."
- Cross-check critical claims on the source page before sharing with clients.
- Keep an internal log of AI errors to refine prompts and decide where human-only review is mandatory.
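Several of these guardrails can be baked into a reusable prompt template rather than remembered each time. A minimal sketch; the domain list, tax year, and phrasing are illustrative assumptions, not a tested prompt, and nothing here removes the need for human verification of the answer:

```python
# Sketch: a prompt template encoding the guardrails above.
# Domains and wording are illustrative assumptions, not a vetted prompt.
from textwrap import dedent

ALLOWED_DOMAINS = ["gov.uk", "legislation.gov.uk", "fca.org.uk"]

def build_prompt(question: str, tax_year: str,
                 jurisdiction: str = "UK") -> str:
    """Wrap a question with jurisdiction, tax year, citation
    constraints, and an explicit 'say you don't know' instruction."""
    domains = ", ".join(ALLOWED_DOMAINS)
    return dedent(f"""\
        Jurisdiction: {jurisdiction}. Tax year: {tax_year}.
        Question: {question}
        Cite only pages on these domains: {domains}.
        If you are unsure or data is missing, say you don't know.""")

print(build_prompt("What is the annual ISA allowance?", "2025/26"))
```

A template like this makes the tax year and citation constraints impossible to forget, and gives the team one place to tighten wording when the error log shows a recurring failure.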
Fast fact checks you can bookmark
- Current ISA rules and limits: gov.uk/individual-savings-accounts
- What counts as "advice" vs information: fca.org.uk/consumers/what-advice
A workable model for your team
- Use AI for speed: research, first drafts, scenario outlines, comparisons.
- Keep a human for decisions: suitability, recommendations, and sign-off.
- Document sources and assumptions every time you use AI output in client work.
- Train your staff on prompting, verification, and compliance boundaries.
Want structured upskilling?
If you're formalizing AI use in a finance function, build capability before you scale. This list of tools is a useful starting point: AI tools for Finance.
Bottom line
AI is a powerful assistant, not a licensed adviser. Use it to save time and widen your thinking, but make sure a regulated professional owns the judgment, the context, and the accountability.
The upside is real - so are the consequences of getting it wrong. Keep AI on a leash and your compliance clean.