Greener AI in Europe: Scaleway and Fujitsu test MONAKA CPU with 1.9x energy and 1.7x cost gains

Scaleway and Fujitsu are piloting MONAKA CPU inference to cut energy use and costs for European AI. Early tests show up to 1.9x energy-efficiency and 1.7x cost-efficiency gains, with support for models up to 70B parameters.

Published on: Dec 07, 2025

MONAKA Processor Promises Sustainable AI for European Firms

Scaleway and Fujitsu are partnering to test CPU-based AI inference as a more sustainable, cost-aware path for European organizations. The effort centers on Fujitsu's MONAKA processor and Scaleway's cloud platform, with a clear focus on energy use, total cost, and reliable performance at scale. Early results from Fujitsu show up to 1.9x better energy efficiency and up to 1.7x better cost efficiency on select inference workloads. MONAKA also supports models up to 70B parameters, opening room for large-scale deployments without ballooning infrastructure bills.

Why this matters

As AI moves from pilots to always-on production, cost per request and energy draw become hard constraints. Many teams don't need peak training throughput; they need predictable, efficient inference that respects EU data requirements. CPU-based options like MONAKA aim to complement GPU fleets, helping teams match the right architecture to the job. The result: more choices for meeting SLAs while keeping budgets and environmental impact in check.

What MONAKA brings

  • Energy efficiency: Early tests indicate up to 1.9x improvement on selected inference workloads.
  • Cost efficiency: Up to 1.7x gains reported in early measurements for specific tasks.
  • Scale: Support for models up to 70B parameters for high-demand use cases.
  • Reliability: Built for steady, predictable inference performance.
  • Real-world testing: Deployment trials planned within Scaleway's cloud to validate results in production-like conditions.

The collaboration targets Europe's need for infrastructure that balances performance with data sovereignty. By expanding CPU-based options alongside existing stacks, Scaleway aims to provide clear guidance on performance, cost, and environmental impact across workload profiles.

What this means for your team

  • IT and Infra: Consider CPU inference for steady, high-volume traffic where latency targets are modest and cost per token/request dominates.
  • Developers: Review model formats, quantization, and batching strategies tuned for CPU to squeeze more throughput per watt (see the sketch after this list).
  • Product: Plan capacity with true TCO in mind: hardware, energy use, and regional data compliance.
  • Security and Compliance: Keep data residency within the EU while maintaining transparency on environmental metrics.
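
As a rough illustration of what CPU-tuned inference settings look like, here is a minimal sketch using the open-source llama-cpp-python bindings with a 4-bit quantized model. This is not MONAKA- or Scaleway-specific; the model path, thread count, and batch size are placeholder assumptions to show where quantization and batching choices enter the configuration.

```python
# Minimal sketch of CPU inference with a quantized model via llama-cpp-python.
# Model file, thread count, and batch size below are illustrative assumptions,
# not values validated on MONAKA or Scaleway hardware.
from llama_cpp import Llama

llm = Llama(
    model_path="./model.Q4_K_M.gguf",  # 4-bit quantized weights (placeholder path)
    n_ctx=4096,      # context window
    n_threads=16,    # match the CPU cores available to the process
    n_batch=512,     # prompt-processing batch size
)

output = llm(
    "Summarize the benefits of CPU-based inference in two sentences.",
    max_tokens=128,
    temperature=0.2,
)
print(output["choices"][0]["text"])
```

In practice, teams would benchmark different quantization levels, thread counts, and batch sizes against their own latency targets, since throughput per watt depends heavily on the workload and the underlying CPU.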

How Scaleway and Fujitsu will validate

The teams will test MONAKA across real inference scenarios on Scaleway's platform. The goal is to publish practical guidance that maps workload types to the right infrastructure, along with clear performance and cost data. This helps engineering and product leaders make informed decisions without guesswork. It also supports Europe's push for a more sustainable digital footprint.

Who should explore this now

  • Teams running large LLMs or vision models with steady production traffic.
  • Organizations with strict EU data-residency and sustainability objectives.
  • Builders looking to reduce spend without sacrificing throughput or uptime.

To learn more about the companies behind this effort, see Fujitsu and Scaleway. If you're upskilling teams for efficient AI deployment and inference ops, explore role-based learning paths at Complete AI Training.

