The FCA's Mills Review on AI in retail financial services: what finance leaders should prioritise now
12 February 2026
On 27 January 2026, the UK Financial Conduct Authority (FCA) launched the Mills Review to examine how AI could affect retail financial services through 2030 and beyond. The scope covers market structure, firms' operations, consumer behaviour and how regulators may adjust in response. This follows rising political and parliamentary interest in AI use across finance.
What the Review will focus on
- How AI could develop over the next several years.
- How those developments may affect markets and firms.
- Impacts on consumers, including benefits and risks.
- How regulators may need to evolve to keep retail markets working well.
Agentic AI: new interfaces, new pressures
The FCA flags agentic AI as a key area. By 2030, many consumers may use AI-mediated interfaces to interact with financial services instead of engaging directly with firms.
That shift can improve access and speed, but it could compress margins on traditional advice, raise suitability and transparency questions, and drive herding if similar AI systems steer users in the same direction.
Risk and governance across the full tech stack
The FCA frames AI risks within a broader technology ecosystem: AI is a cross-cutting policy issue that touches model risk, data quality, operational resilience and third-party dependence. Technology leaders should therefore align operational controls with board and senior manager responsibilities, so that governance, model validation and resilience planning sit with accountable owners rather than in isolated technical teams.
- Map all AI use cases across the product and customer lifecycle, including experimental tools and shadow IT.
- Clarify Senior Manager accountability and audit trails for AI-enabled decisions (including overrides and escalation paths).
- Tighten third-party and Critical Third Party oversight; test failover and exit plans.
- Strengthen model validation, monitoring, explainability and bias testing; document data lineage and consent.
- Run resilience exercises for AI outages, model drift and prompt injection incidents.
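One item in the checklist above, monitoring for model drift, can be made concrete with a simple statistical check. The sketch below uses the Population Stability Index (PSI) to compare a model's live score distribution against its validation baseline; the bin count, the 0.25 alert threshold and the sample data are illustrative assumptions for this example, not FCA guidance.

```python
import math

def psi(expected: list[float], actual: list[float], bins: int = 10) -> float:
    """Population Stability Index between two samples of a model score or
    input feature. Higher PSI means the live distribution has drifted
    further from the baseline."""
    lo = min(min(expected), min(actual))
    hi = max(max(expected), max(actual))
    width = (hi - lo) / bins or 1.0  # guard against all-identical values

    def shares(sample: list[float]) -> list[float]:
        counts = [0] * bins
        for x in sample:
            i = min(int((x - lo) / width), bins - 1)
            counts[i] += 1
        # Floor each share so sparse bins don't produce log(0) below.
        return [max(c / len(sample), 1e-6) for c in counts]

    e, a = shares(expected), shares(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))

# Illustrative data: scores recorded at model validation vs. shifted live scores.
baseline = [0.1 * i for i in range(100)]
live = [0.1 * i + 3.0 for i in range(100)]

# A common rule of thumb (an assumption here): PSI above 0.25 signals
# material drift and should trigger an incident or model review.
if psi(baseline, live) > 0.25:
    print("drift alert: escalate for model review")
```

In practice a check like this would run on a schedule per model and per key feature, with the alert wired into the same escalation paths documented for AI-enabled decisions.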
Competition and potential market power
As consumers delegate more decisions to AI agents, firms may launch new value propositions. But concentration risks can emerge if AI providers favour certain firms or if deep personalisation locks consumers in. The FCA is looking across the AI value chain, including actors outside its current perimeter.
- Plan for distribution through AI agents as well as direct channels.
- Push for interoperability, data portability and clear switching paths to reduce lock-in.
- Monitor partner terms for preferential ranking, bundling or self-preferencing risks.
- Build contingencies if a major AI provider limits access, alters pricing, or changes APIs.
Regulatory approach: outcomes first, adaptations likely
The FCA is not proposing new prescriptive rules at this stage. Instead, it is assessing how its outcomes-based regimes may need updates as AI changes the pace and scale of activity: the Consumer Duty, the Senior Managers and Certification Regime (SM&CR), Operational Resilience and the Critical Third Parties regime. Regulatory and compliance teams should start translating the Review's direction of travel into their governance and risk frameworks now.
- Map Consumer Duty outcomes to AI-driven journeys, disclosures and nudges; test with real users at risk of harm.
- Update responsibility maps and reasonable steps documentation for AI governance under SM&CR.
- Refresh impact tolerances and playbooks for AI-specific incidents, including third-party failures.
- Evidence continuous oversight: metrics, thresholds, triggers, and independent challenge.
Deadline: contribute by 24 February 2026
The FCA has invited input from stakeholders by 24 February 2026. Firms should use this window to bring concrete evidence and suggestions.
- Provide data on customer outcomes, error rates, overrides, complaints and switching frictions linked to AI features.
- Share results from consumer testing across different segments, especially vulnerable customers.
- Propose proportionate safeguards, audit trails and disclosure standards that work in live operations.
- Flag areas where guidance would reduce uncertainty without blocking useful innovation.
- Coordinate with trade bodies to avoid fragmented responses.
For background on current expectations, see the FCA's pages on the Consumer Duty and the Senior Managers and Certification Regime.