Verda and Compal partner to supply GPU servers for AI cloud expansion in Europe and APAC

Compal Electronics will supply liquid-cooled GPU servers to Helsinki-based Verda, expanding AI compute capacity across Europe and Asia-Pacific. The deal targets enterprises that need data to stay within specific regions due to local compliance laws.

Categorized in: AI News, IT and Development
Published on: May 08, 2026

Compal and Verda Partner on GPU Infrastructure for European AI Cloud

Compal Electronics will supply high-density GPU server systems to Verda, a Helsinki-based AI cloud provider, to expand AI infrastructure across Europe and the Asia-Pacific region. The partnership addresses growing demand for localized compute capacity as enterprises prioritize data residency and regulatory compliance.

Verda operates data centers powered by renewable energy and serves AI labs and research teams training frontier models. The company focuses on agentic applications (systems that process extensive context and run at high concurrency), which require specialized hardware and thermal management.

Compal's servers use liquid cooling to handle the power density of modern AI workloads while maintaining efficiency. The manufacturer brings expertise in accelerated computing and system integration, areas critical for scaling AI infrastructure.

Why This Matters for Infrastructure Teams

The deal reflects a shift in how organizations source compute. Rather than relying solely on hyperscalers, companies are turning to regional providers that offer control over data location and energy efficiency. This matters for teams managing compliance requirements or seeking alternatives to centralized cloud providers.

Compal operates manufacturing facilities across Taiwan, Vietnam, and the United States, which helps distribute supply-chain risk and align production with regional demand. This geographic spread reduces lead times for customers in different regions.

The Broader Context

Verda's focus on sovereign AI, infrastructure that keeps data within specific regions, addresses a real concern for governments and enterprises. Data residency laws in the EU and elsewhere make regional compute capacity a necessity, not a preference.

The partnership also signals how hardware manufacturers are adapting to AI's demands. Standard server designs fall short for training large models or running inference at scale. Thermal design, power delivery, and interconnect speed all matter more than they did in previous generations.

For IT and development professionals, this underscores the importance of understanding infrastructure constraints when designing AI systems. Where your compute runs, how it's cooled, and who controls it are no longer afterthoughts; they're architectural decisions.


