Compal and Verda Partner on GPU Infrastructure for European AI Cloud
Compal Electronics will supply high-density, liquid-cooled AI server systems to Verda, a Helsinki-based cloud provider, to expand AI infrastructure across Europe and Asia-Pacific. The partnership addresses growing demand for regional compute capacity as enterprises and governments prioritize data residency and regulatory compliance.
Verda operates GPU data centers powered by renewable energy and serves AI labs, research teams, and startups training frontier models. The company's infrastructure is designed for agentic applications, meaning systems that process large amounts of context and handle many concurrent workloads, while maintaining thermal efficiency.
Why This Matters for Infrastructure Teams
The partnership reflects a shift in how AI compute is being deployed. Rather than relying on centralized cloud regions, organizations are building distributed infrastructure closer to users and data sources. This approach reduces latency, improves compliance with local regulations, and gives teams more control over where their models run.
Compal brings expertise in accelerated computing and thermal design, both critical for managing the power density and heat output of GPU-heavy systems. The company operates manufacturing facilities across Taiwan, Vietnam, and the United States, giving it supply-chain flexibility for regional deployments.
Alan Chang, Vice President of Infrastructure Solutions at Compal, said the collaboration "demonstrates our ability to deliver advanced AI systems at scale for customers building the next generation of AI clouds." Jorge Santos, Chief Operating Officer at Verda, called the partnership "an important step in our plans to expand our presence in the APAC region."
The Broader Context
This deal signals growing competition among cloud providers to serve specific regions and workloads. As enterprises move beyond general-purpose cloud services, infrastructure providers are specializing. Verda's focus on renewable-powered, high-performance compute for frontier model training positions it differently from hyperscalers.
For development and infrastructure teams, this means more options for where and how to run AI workloads. It also means evaluating tradeoffs: regional providers may offer better compliance and latency characteristics but with different pricing and feature sets than global providers.
Compal, established in 1984, has expanded beyond consumer devices into cloud servers and automotive electronics. The company ranks among Forbes Global 2000 companies and has been recognized by CommonWealth Magazine as one of Taiwan's top manufacturers.