SK Telecom, Arm and Rebellions partner to build CPU-NPU server solution for AI inference

SK Telecom, Arm, and Rebellions signed an MOU on April 9 to build AI inference servers pairing Arm's AGI CPU with Rebellions' RebelCard NPU. SK Telecom will test the setup in its own data centers.

Published on: Apr 13, 2026

SK Telecom, Arm, and Rebellions team up on AI inference servers

SK Telecom signed a memorandum of understanding with Arm and Rebellions on April 9 to develop an AI server combining CPUs and neural processing units (NPUs). The three companies will integrate Arm's "Arm AGI CPU" with Rebellions' "RebelCard™" NPU and test the solution at SK Telecom's data centers.

The partnership addresses a fundamental shift in AI infrastructure priorities. As the industry shifts from training models to running inference at scale, the focus has moved from raw computing power to delivering results faster, cheaper, and with less energy.

Why NPUs matter for inference

AI inference, the stage where a trained model processes data repeatedly and quickly to deliver real-time services, runs continuously in production. GPUs can handle this work, but their training-oriented designs often draw more power than inference requires. NPUs are built specifically for inference tasks and consume far less energy, which directly affects operating costs.

A single inference server needs both types of chips working together. CPUs manage data input/output, network communication, memory, and scheduling, while NPUs focus solely on the AI computation itself. This division of labor improves overall efficiency.
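The division of labor above can be sketched in code. This is an illustrative toy, not any vendor's API: the CPU side does the queuing, batching, and scheduling, while `npu_infer` is a hypothetical stand-in for an accelerator runtime call, simulated here on the CPU.

```python
import queue
import threading

def npu_infer(batch):
    # Hypothetical placeholder: a real NPU runtime would execute the
    # model's matrix math here. We simulate it with a trivial transform.
    return [x * 2 for x in batch]

def npu_worker(requests, results):
    # The accelerator side does only the model computation.
    while True:
        batch = requests.get()
        if batch is None:  # shutdown signal from the CPU side
            break
        results.put(npu_infer(batch))

def serve(batches):
    # The CPU side handles I/O, scheduling, and hand-off: it queues
    # work for the accelerator and collects the finished results.
    requests, results = queue.Queue(), queue.Queue()
    worker = threading.Thread(target=npu_worker, args=(requests, results))
    worker.start()
    for batch in batches:
        requests.put(batch)                     # schedule work
    outputs = [results.get() for _ in batches]  # gather responses in order
    requests.put(None)                          # stop the worker
    worker.join()
    return outputs

print(serve([[1, 2], [3, 4]]))  # [[2, 4], [6, 8]]
```

In a real deployment the worker thread would be replaced by the NPU vendor's runtime, but the shape is the same: the CPU never does the heavy math, it only keeps the accelerator fed.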

What the partnership delivers

Arm's "Arm AGI CPU" is the company's first data center processor designed for AI inference. Rebellions' "RebelCard™" NPU handles large-scale inference workloads. Together, they form what SK Telecom describes as an efficient server architecture for running AI services at scale.

SK Telecom will deploy the solution in its own data centers to test performance and stability. The company is also considering running its proprietary foundation model, A.X K1, on the new platform.

For IT professionals and developers, understanding CPU-NPU integration is becoming essential as inference workloads dominate production deployments.

