Data Sovereignty and AI Compliance in Financial Services
The collision of AI's compute needs and financial regulation is forcing a rethink of data centre design. AI is moving fast in capital markets, banking, and insurance, but you can't just add GPUs and hope for the best. Goldman Sachs Research expects AI to drive roughly 27% of data centre energy demand by 2027. For finance leaders, the real challenge is keeping every kilowatt and every byte inside strict regulatory lines.
Regulators now look straight at your infrastructure
The Digital Operational Resilience Act (DORA) changed the game across the EU in January 2025. It gives supervisors direct oversight of critical ICT providers that support regulated firms, including some cloud and colocation operators. You can read the regulation summary here: Digital Operational Resilience Act (DORA).
Financial institutions are tightening on-site reviews and writing in audit rights for critical suppliers. Expect technical site visits that validate electrical and mechanical backups, physical security, access control, incident response, and data protection measures.
Operators, in turn, need audit-ready documentation and monitoring that satisfies uptime, privacy, and incident reporting obligations. The push for digital sovereignty means verifiable data residency, strict data locality controls, and technical safeguards that block cross-border transfers by default.
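Blocking cross-border transfers by default comes down to a fail-closed policy check: anything not explicitly inside the approved boundary is refused. A minimal sketch in Python, assuming a hypothetical `ALLOWED_REGIONS` allowlist (the region names are illustrative, not tied to any specific provider):

```python
# Sketch of a default-deny data-residency control: a destination region
# is permitted only if it appears on an explicit allowlist, so transfers
# fail closed rather than open. Region names are illustrative.

ALLOWED_REGIONS = {"eu-central-1", "eu-west-1"}  # EU-only boundary

def egress_permitted(destination_region: str) -> bool:
    """Return True only for destinations inside the approved boundary."""
    return destination_region in ALLOWED_REGIONS

def transfer(payload: bytes, destination_region: str) -> None:
    """Refuse any transfer to a region outside the sovereign boundary."""
    if not egress_permitted(destination_region):
        raise PermissionError(
            f"Blocked: {destination_region} is outside the sovereign boundary"
        )
    # ... hand off to the approved transfer mechanism here ...
```

In practice this logic lives in network policy and gateway configuration rather than application code, but the default-deny shape is the same: the allowlist, not the blocklist, defines the boundary.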
High-density AI meets compliance constraints
AI training racks now draw 50-100 kW each, roughly 10x traditional finance workloads. That demands advanced cooling, such as rear-door heat exchangers and direct-to-chip liquid cooling, without compromising availability or security obligations.
Paul Ju, Senior Vice President of Global ISG at ASUS, notes that from hardware to software, their teams are helping public sector and financial clients adopt sovereign AI models that stay under local control.
External model usage adds pressure. Many services require sending sensitive inputs to third-party LLMs. This is driving demand for regulated cloud options that keep data local while still giving access to high-end compute.
Architecting compliant AI environments
Data centre operators are rolling out dedicated private environments for regulated workloads. Expect hardened physical security, segregated networks, and granular access controls that pass audits while supporting dense AI clusters.
"We're not anti-AI-we're pro-control," says Jacob Mellor, CTO of Iron Software. "Teams shouldn't have to choose between productivity and data sovereignty. Our approach keeps customer data inside the boundary you set."
Finance teams also need precise energy and emissions data for reporting, down to circuit and tenant level, while hosting high-performance computing for training and inference.
Contracts, capacity, and the grid
Colocation agreements are evolving: continuous compliance monitoring, regular technical audits, and explicit data residency commitments are becoming standard. The build-out is huge: JLL expects roughly 10 GW of new capacity to break ground in 2025. But electric grid availability is the choke point; Deloitte reports that 72% of data centre executives view grid capacity as extremely challenging.
Meanwhile, the World Economic Forum projects AI spending in financial services could reach about $97 billion by 2027. Operators that deliver compliant, AI-ready facilities, plus transparent reporting, will earn a clear advantage. See context from the WEF here: AI in financial services.
What finance leaders should demand
- Proven data residency: geo-fencing, egress blocking, deep packet inspection (DPI) to catch unintended transfers.
- Customer-controlled keys: hardware security modules (HSMs), bring-your-own-key (BYOK) or external KMS, dual-control policy enforcement.
- Network segregation: private connectivity, east-west controls, micro-segmentation for AI clusters.
- Audit-ready monitoring: immutable logs, retention mapped to regulation, incident SLAs, tabletop tests.
- High-density readiness: 50-100 kW/rack, liquid cooling, hot aisle containment, N+1/2N redundancy.
- Energy transparency: tenant-level metering, PUE trends, carbon factors, quarterly assurance.
- Resilience and exit: tested restore procedures, offline backups, data portability, supplier substitution plans.
- Contractual safeguards: right to audit, breach notification windows, subcontractor disclosure, data location warranties.
- AI controls: private inference endpoints, on-prem or fenced foundation models, data minimisation and masking by default.
Practical next steps
- Map AI use cases to data classifications. Decide what runs on-prem, in regulated cloud, or via tightly controlled external models.
- Run a DORA-aligned third-party risk review for colocation and cloud. Include failover and incident drills.
- Pilot a sovereign AI stack: GPUs in a controlled site, strict egress policies, private LLMs, retrieval from approved datasets.
- Negotiate energy and emissions reporting clauses. Require facility-level and tenant-level metering.
- Plan capacity with the grid in mind: substation lead times, electrical upgrades, and cooling retrofits tied to AI roadmaps.
- Stand up model risk controls: red-teaming, prompt logging, human-in-the-loop checks for sensitive workflows.
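Prompt logging only supports audit if the log is tamper-evident. One common pattern is a hash chain, where each entry's digest covers the previous entry's digest, so any retroactive edit breaks the chain. A minimal sketch, assuming a hypothetical `PromptLog` class (storage backend and retention policy are out of scope):

```python
import hashlib
import json
import time

# Sketch of tamper-evident prompt logging: each entry's SHA-256 digest
# covers the previous entry's digest, so altering any past entry is
# detectable when the chain is re-verified at audit time.

class PromptLog:
    GENESIS = "0" * 64  # placeholder hash for the first entry's predecessor

    def __init__(self):
        self.entries = []
        self._last_hash = self.GENESIS

    def record(self, user: str, prompt: str) -> dict:
        """Append an entry chained to the previous one."""
        entry = {
            "ts": time.time(),
            "user": user,
            "prompt": prompt,
            "prev": self._last_hash,
        }
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        entry["hash"] = digest
        self.entries.append(entry)
        self._last_hash = digest
        return entry

    def verify(self) -> bool:
        """Recompute the chain; False means some entry was altered."""
        prev = self.GENESIS
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            recomputed = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if e["prev"] != prev or recomputed != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

In production the chain head would be anchored somewhere outside the log store (an HSM, a WORM bucket, or a periodic signed checkpoint) so an attacker who controls the store cannot simply rebuild the whole chain.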
Join the conversation
These themes will be front and centre at our New York breakfast roundtable on 29 January 2026: Breakfast at Tiffany's - Reimagining Financial Services Through AI. If you want practical training and tool lists built for finance teams, explore this resource: AI tools for finance.