From Classrooms to Cars: Spiking AI Cuts Energy Use 10-20x at the Edge

A UOC team runs spiking neural networks on affordable edge hardware, staying under 10 watts and at tens of microjoules per operation. In driving tests, the networks stay accurate while using 10-20x less energy than conventional convolutional networks.

Published on: Jan 23, 2026

Low-power AI at the edge: UOC team advances spiking neural networks for real workloads

AI runs on electricity. The International Energy Agency estimates that data centres already draw roughly 1.5% of global electricity, with demand set to double by 2030 if trends hold. For engineering teams, that's a scaling bottleneck and a cost line you can't ignore.

Researchers at the Universitat Oberta de Catalunya (UOC) present two studies that make the case for a clear alternative: spiking neural networks (SNNs) built for low-power, high-performance deployment on accessible hardware. The goal is straightforward: treat energy as a first-class design constraint so teams can ship AI that's sustainable, affordable, and resilient in weak-connectivity environments.

Why this matters for IT and development teams

Energy-aware AI reduces infrastructure spend, heat, and latency. It also enables on-device inference for sensors, robots, and vehicles without relying on GPUs or persistent cloud links. That's better for privacy, better for uptime, and better for budgets.

Study 1: Distributed SNNs on low-cost edge hardware

The first study shows SNNs running on off-the-shelf components (a Raspberry Pi 5 paired with a BrainChip Akida accelerator) and delivering high performance while staying under 10 watts. The workflow covers training, conversion, and deployment without a data centre or a GPU.

Devices coordinate using protocols you already know: MQTT (Message Queuing Telemetry Transport), SSH, and Vehicle-to-Everything (V2X). Results can be shared in under a millisecond at an energy cost of about 10-30 microjoules per operation. Think distributed, low-latency AI for transport, environmental monitoring, and industrial IoT.
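
To give a sense of how lightweight that coordination layer is, here is a minimal sketch of an edge device publishing an inference result over MQTT with the paho-mqtt client. The broker address, topic name, and payload fields are illustrative assumptions, not details taken from the studies.

    # Minimal sketch: publish an on-device inference result over MQTT.
    # Broker address, topic, and payload fields are assumed for illustration.
    import json
    import time

    import paho.mqtt.client as mqtt

    BROKER = "192.168.1.10"             # assumed local broker on the edge network
    TOPIC = "edge/vehicle-01/steering"  # hypothetical topic naming scheme

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0 constructor
    client.connect(BROKER, 1883, keepalive=60)
    client.loop_start()

    payload = json.dumps({
        "steering_angle": 0.12,   # example model output
        "energy_uj": 18.4,        # microjoules measured for this inference (assumed field)
        "ts": time.time(),
    })
    client.publish(TOPIC, payload, qos=0)  # QoS 0 keeps per-message latency minimal

    client.loop_stop()
    client.disconnect()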

This approach also improves accessibility and privacy. Schools, hospitals, rural deployments, and resource-constrained teams can run meaningful AI locally, both safely and sustainably.

Study 2: Energy-aware autonomous driving with SNNs

The second study compares SNNs to standard convolutional neural networks (CNNs) on tasks like steering angle prediction and obstacle detection. With the right input encoding, the SNNs hold their accuracy while using 10-20x less energy than the CNNs.
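
The article doesn't say which encoding the team used, so purely as an illustration, here is a sketch of rate coding (Bernoulli sampling per timestep, a common Poisson-style encoder), one typical way to turn camera frames into the spike trains an SNN consumes. The shapes, timestep count, and the random frame are assumptions.

    # Illustrative rate-coding sketch: turn normalized pixel intensities into
    # binary spike trains. Not the paper's encoder; shapes and timesteps assumed.
    import numpy as np

    def rate_encode(image, timesteps=20, max_rate=1.0, rng=None):
        """Return spikes of shape (timesteps, *image.shape): each entry is 1
        with probability proportional to the pixel intensity in [0, 1]."""
        if rng is None:
            rng = np.random.default_rng(0)
        probs = np.clip(image, 0.0, 1.0) * max_rate
        return (rng.random((timesteps,) + image.shape) < probs).astype(np.uint8)

    # Example: encode a fake 64x64 camera frame into 20 timesteps of spikes.
    frame = np.random.default_rng(1).random((64, 64))
    spikes = rate_encode(frame, timesteps=20)
    print(spikes.shape, spikes.mean())  # (20, 64, 64) and the average firing rate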

The team also proposes a practical metric that blends accuracy with energy consumption, helping engineers choose models based on real deployment costs instead of benchmark scores alone.
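
The article does not reproduce the metric itself, so the helper below is hypothetical: one simple way to fold energy into model selection is accuracy per joule, weighted by how strongly energy should be penalized. The example numbers echo the figures above (tens of microjoules per SNN inference, roughly 10-20x more for a CNN).

    # Hypothetical accuracy-vs-energy score; the UOC paper's exact metric is not
    # reproduced in this article. Higher is better.
    def efficiency_score(accuracy, joules_per_inference, alpha=1.0):
        """Accuracy per joule, with alpha controlling how strongly energy
        consumption is penalized."""
        return accuracy / (joules_per_inference ** alpha)

    # Example: an SNN at 91% accuracy and 25 uJ/inference versus a CNN at
    # 93% accuracy and 400 uJ/inference (illustrative numbers).
    snn = efficiency_score(0.91, 25e-6)
    cnn = efficiency_score(0.93, 400e-6)
    print(f"SNN: {snn:,.0f} accuracy/J  CNN: {cnn:,.0f} accuracy/J")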

What to do next (practical steps)

  • Treat energy as a KPI. Track watts, joules/inference, and latency alongside accuracy (see the measurement sketch after this list).
  • Prototype a low-power stack: Raspberry Pi 5 + Akida for SNN acceleration, with containerized services.
  • Use MQTT for lightweight messaging; secure sessions with SSH; plan for V2X where mobility matters.
  • Build an edge-first pipeline: local preprocessing, on-device inference, selective uplink of summaries.
  • Adopt an efficiency metric in model selection: opt for the best accuracy per joule, not just per FLOP.
  • Target deployments where privacy and resilience matter: clinics, schools, factories, rural sites.
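
As a starting point for the energy KPI, here is a rough sketch of measuring latency and joules per inference. It assumes you can sample board power somehow (an external USB power meter or an INA219-style current sensor are typical options); the power reading and the model call are placeholders, not part of the studies.

    # Rough sketch of tracking joules/inference as a KPI. The power reading is
    # stubbed out; replace it with a real sample from your meter or sensor.
    import time

    def read_power_watts():
        # Placeholder: return a real measurement from your instrumentation.
        return 6.8  # assumed average draw in watts for a Pi 5 plus accelerator

    def measure_energy(run_inference, n=100):
        """Return (seconds, joules) per inference, averaged over n runs."""
        start = time.perf_counter()
        for _ in range(n):
            run_inference()
        latency = (time.perf_counter() - start) / n
        return latency, latency * read_power_watts()

    latency, joules = measure_energy(lambda: None)  # plug in your model's inference call
    print(f"{latency * 1e3:.2f} ms/inference, {joules * 1e6:.1f} uJ/inference")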

Where this fits

Both studies contribute to practical, energy-thrifty AI systems that are easier to deploy and maintain. They align with UN SDGs 9 (Industry, Innovation and Infrastructure), 11 (Sustainable Cities and Communities), and 13 (Climate Action), but the benefit for engineering teams is immediate: less power, less heat, lower cost, same business impact.
