FCA Confirms No New AI Rules for UK Financial Services Firms

The FCA confirms no new AI-specific rules for UK financial firms, relying on existing regulations to manage AI risks. Firms must integrate AI oversight into current governance frameworks.

Published on: Sep 11, 2025

No Bespoke Rules for AI, FCA Confirms

Financial services firms regulated in the UK will not face new, AI-specific regulations, the Financial Conduct Authority (FCA) has confirmed. In a recent 26-page update, the FCA emphasized that it remains a “technology-agnostic, principles-based and outcomes-focused regulator.” Instead of introducing bespoke AI rules, the FCA pointed to existing regulations and guidance that already apply to the use of AI within financial firms.

This means firms must manage AI-related risks within current regulatory frameworks, rather than expect new ones specifically for AI. The FCA’s stance serves as a clear reminder that firms should already be assessing how AI impacts their controls and governance.

UK’s Sector-Based Approach to AI Regulation

The UK continues to take a distinct path from the EU’s AI Act, which introduces a tiered, AI-specific regulatory system. Instead, the UK relies on existing laws and sectoral regulation to manage AI risks. This includes data protection, consumer protection, product safety, and equality laws, applied in the context of AI.

Without an overarching AI framework, UK regulators follow five cross-sector principles:

  • Safety, security, and robustness
  • Appropriate transparency and explainability
  • Fairness
  • Accountability and governance
  • Contestability and redress

Regulators must also consider their statutory objectives, including promoting the UK’s economic competitiveness and growth. The FCA highlighted the important role that technology, including AI, plays in UK financial markets, and the contribution those markets make to the wider economy.

Implications for Financial Services Firms

Firms using AI should prepare for increased scrutiny around governance, explainability, and accountability—especially when AI affects consumer outcomes or decision-making. The FCA outlined how its existing regulatory framework relates to these principles, referencing:

  • The FCA’s Principles for Business
  • The FCA Handbook and Senior Management Arrangements, Systems and Controls (SYSC) rules
  • The Consumer Duty regime
  • Operational resilience requirements
  • Fair treatment of vulnerable customers
  • The Senior Managers and Certification Regime (SMCR)

Senior managers carry clear responsibility for the use of AI within their firms. The FCA stated that those subject to the enhanced SMCR regime are accountable for any AI applications relating to their business areas or management functions. Reporting obligations under the Consumer Duty could also include consideration of AI’s impact on retail consumers.

FCA’s Use of AI and Support for Firms

The FCA is actively adopting AI for supervisory purposes like scam detection and financial crime monitoring, marking a shift toward data-driven regulation. The regulator is enhancing its use of data and technology to become more adaptive and assertive.

Machine learning has already helped the FCA tackle online scams. The regulator is particularly focused on how AI can detect complex market abuses that are hard to identify using traditional methods.
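To make the idea concrete, unsupervised anomaly detection is one common way machine learning surfaces unusual trading patterns that rule-based checks tend to miss. The sketch below is purely illustrative and does not describe the FCA’s actual tooling; the features, thresholds, and use of scikit-learn’s IsolationForest are assumptions made for the example.

```python
# Hypothetical illustration: flagging unusual trading activity with an
# unsupervised anomaly detector. Feature names and distributions are
# invented for the sketch; the FCA has not published its models.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic order-level features: trade size, price deviation from mid,
# and seconds between a quote and its cancellation.
normal = rng.normal(loc=[1_000, 0.0, 5.0], scale=[300, 0.02, 2.0], size=(5_000, 3))
suspicious = rng.normal(loc=[9_000, 0.15, 0.2], scale=[500, 0.03, 0.1], size=(20, 3))
orders = np.vstack([normal, suspicious])

# Fit the detector and mark roughly the most anomalous 1% of orders.
model = IsolationForest(contamination=0.01, random_state=0).fit(orders)
flags = model.predict(orders)  # -1 marks an outlier for analyst review

print(f"Flagged {int((flags == -1).sum())} of {len(orders)} orders for review")
```

In practice a human analyst would still review each flagged order; the model narrows the field rather than making the supervisory judgment.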

To support innovation, the FCA is establishing a service that will let firms test AI models in real time. Applications for the first wave of this initiative recently closed. The service forms part of the FCA’s ‘AI Lab,’ which aims to provide a safe environment for firms and stakeholders to develop and test AI technologies. Tools such as access to synthetic data are intended to foster collaboration and proof-of-concept development.
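As an illustration of what synthetic data means in this context, the hypothetical sketch below generates an artificial payments table that a firm could use to prototype a model without handling real customer records. All field names and distributions are invented for the example and do not describe any FCA or firm dataset.

```python
# Hypothetical sketch: building a synthetic retail-payments table for
# proof-of-concept work, so no real customer data is needed.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 1_000

synthetic_payments = pd.DataFrame({
    "customer_id": rng.integers(10_000, 99_999, size=n),
    "amount_gbp": np.round(rng.lognormal(mean=3.5, sigma=1.0, size=n), 2),
    "merchant_category": rng.choice(["grocery", "travel", "utilities", "online"], size=n),
    "hour_of_day": rng.integers(0, 24, size=n),
})

# A prototype fraud or affordability model could be trained and demonstrated
# against this table before any real data is requested.
print(synthetic_payments.head())
```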

Jessica Rusu, the FCA’s Chief Data, Information and Intelligence Officer, stated the goal is to create a space where new AI technologies can be tested responsibly.

What This Means for Professionals in Finance and Legal Sectors

The FCA’s position underscores the need for firms to integrate AI risk management into existing compliance and governance frameworks. Professionals in the finance, legal, real estate, and construction sectors should keep abreast of how AI intersects with current regulatory requirements.

Understanding how AI fits within established rules on consumer protection, accountability, and operational resilience is critical. For those looking to deepen their expertise, exploring targeted AI training can provide practical skills to manage these challenges effectively.

For further learning on AI’s role in financial services and compliance, visit Complete AI Training’s courses by job.