How brain-inspired chips could make AI more energy efficient

AI models consume much more energy than the human brain. Neuromorphic computing mimics brain functions to create energy-efficient AI hardware with improved adaptability.

Published on: Jul 02, 2025


Artificial intelligence models can consume staggering amounts of energy—over 6,000 joules to generate a single text response. In stark contrast, the human brain requires roughly 20 joules per second (about 20 watts) to maintain all its functions. This vast gap in energy efficiency drives researchers at the University at Buffalo to look to the brain for inspiration in designing AI hardware that can meet rising energy demands.
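The scale of this gap is easy to make concrete. Using only the two figures quoted above (both rough estimates, not measurements), a back-of-the-envelope calculation shows how long the brain could run on the energy of one AI response:

```python
# Rough comparison based on the article's figures; both numbers are
# illustrative estimates, not measurements.

ai_response_j = 6_000   # joules consumed by one AI text response
brain_power_w = 20      # brain power draw in watts (joules per second)

# How long the brain could run on the energy of a single AI response:
brain_seconds = ai_response_j / brain_power_w
print(brain_seconds)    # 300.0 seconds, i.e. five minutes of full brain function
```

In other words, one text response costs as much energy as about five minutes of everything the brain does.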

"There's nothing as efficient as our brain—it has evolved to maximize information processing and storage while minimizing energy use," explains Sambandamurthy Ganapathy, Ph.D., a professor of physics. While replicating the brain's complexity is unrealistic, mimicking its information handling methods could lead to computers—and AI—that use energy far more efficiently.

Neuromorphic Computing: Brain-Inspired AI Hardware

Neuromorphic computing, a concept dating back to the 1980s, is gaining renewed interest as AI workloads grow more complex and energy-intensive. This approach involves designing hardware that mimics the brain’s way of processing and storing information.

Ganapathy’s team focuses on creating neuromorphic chips by combining quantum science and engineering. Their work centers on materials with unique electrical properties suitable for artificial neurons and synapses, aiming to build devices that operate more efficiently and perform tasks in ways closer to human cognition.

Similarities Between Computers and the Brain

At a fundamental level, computers and brains share some operational similarities. Computers encode data in binary using billions of transistors that switch on or off, representing ones and zeros. Similarly, the brain uses billions of neurons that either fire electrical signals or remain silent.

Neuromorphic computing seeks to move beyond simple binary states toward the brain’s far more complex and dynamic signaling system. This shift could allow computers to process information in richer, more adaptable ways.

Unified Memory and Processing

A critical difference between brains and traditional computers lies in how memory and processing are organized. The brain stores and processes information in the same physical space, with no strict separation between the two.

In contrast, conventional computers separate memory and processing units, which requires significant energy to shuttle data back and forth. This gap becomes even more pronounced when running AI models.

Neuromorphic computing embraces in-memory computing, positioning memory and processing units close together on the chip to cut energy use and improve efficiency.
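A toy model makes the payoff of in-memory computing visible. The energy figures below are illustrative assumptions (data movement is commonly far more expensive than arithmetic, but the exact ratio varies by chip), not measurements of any real device:

```python
# Toy model of the von Neumann bottleneck vs. in-memory computing.
# Energy values are arbitrary illustrative units, not chip measurements.

E_MOVE = 100.0   # assumed cost to shuttle one value between memory and processor
E_MAC  = 1.0     # assumed cost of one multiply-accumulate operation

def von_neumann_energy(n_weights: int) -> float:
    """Each weight must be fetched from separate memory, then used once."""
    return n_weights * (E_MOVE + E_MAC)

def in_memory_energy(n_weights: int) -> float:
    """Weights sit where computation happens, so there is no fetch cost."""
    return n_weights * E_MAC

n = 1_000_000  # weights in a modest model layer
print(von_neumann_energy(n) / in_memory_energy(n))  # 101.0x in this toy model
```

The ratio depends entirely on the assumed costs, but the structure of the saving is the point: co-locating memory and processing removes the per-weight shuttling term entirely.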

Creating Artificial Neurons and Synapses

The brain’s neurons communicate via synapses, which transmit electrical signals and store memory. Neuromorphic hardware aims to replicate this by developing artificial neurons and synapses that emulate this electrical signaling.

Ganapathy’s team works with advanced materials whose electrical conductivity can be precisely controlled to mimic the rhythmic, synchronized oscillations found in the brain.

The Role of Phase-Change Materials (PCMs)

Phase-change materials are central to this research. PCMs can switch between conductive and resistive states when subjected to electrical pulses, and crucially, they retain these states after the pulse ends. This property effectively provides non-volatile memory, much as biological synapses strengthen through repeated activation.
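The behavior described above can be sketched as a minimal software model: pulses nudge a conductance value up or down, and the state persists once the pulses stop. The class, update rule, and numbers here are illustrative assumptions, not a physical device model:

```python
# Minimal sketch of a PCM-like synapse: electrical pulses nudge its
# conductance, and the resulting state persists with no power applied.
# Values and update rule are illustrative, not a device model.

class PCMSynapse:
    def __init__(self, conductance: float = 0.1):
        self.conductance = conductance  # normalized between 0 and 1

    def set_pulse(self, strength: float = 0.1) -> None:
        """A SET pulse moves the material toward its conductive state."""
        self.conductance = min(1.0, self.conductance + strength)

    def reset_pulse(self, strength: float = 0.1) -> None:
        """A RESET pulse moves it back toward its resistive state."""
        self.conductance = max(0.0, self.conductance - strength)

syn = PCMSynapse()
for _ in range(3):
    syn.set_pulse()   # repeated activation "strengthens" the synapse
print(round(syn.conductance, 2))  # 0.4, and the value is retained (non-volatile)
```

The analogy to learning is direct: repeated pulses leave a lasting change in conductance, just as repeated activation leaves a lasting change in synaptic strength.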

Materials such as copper vanadium oxide bronze, niobium oxide, and metal-organic frameworks are being studied for their suitability in neuromorphic chips. The team investigates how voltage and temperature affect these materials down to the electron level to achieve atomic-scale control over their switching behavior.

Building Oscillatory Neural Networks

The next step is synchronizing the oscillations of multiple neuromorphic devices to form networks capable of complex brain-like functions, including pattern recognition and motor control. This could enable AI hardware that performs tasks with a level of flexibility and efficiency closer to biological systems.
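The article does not describe how the team models synchronization, but the classic mathematical picture of coupled oscillators falling into step is the Kuramoto model. The sketch below (with assumed parameters) shows the core effect: devices with slightly different natural frequencies synchronize once coupling is strong enough:

```python
import math
import random

# Kuramoto-style sketch of oscillator synchronization: each "device" has
# its own natural frequency, and coupling pulls the phases together.
# All parameters here are illustrative assumptions.

random.seed(0)
N, K, dt = 10, 2.0, 0.01
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]
freqs = [1.0 + random.uniform(-0.1, 0.1) for _ in range(N)]  # slight spread

def order_parameter(ph):
    """|r| near 1 means synchronized; near 0 means incoherent."""
    re = sum(math.cos(p) for p in ph) / len(ph)
    im = sum(math.sin(p) for p in ph) / len(ph)
    return math.hypot(re, im)

for _ in range(5000):  # integrate the coupled dynamics for 50 time units
    new_phases = []
    for i in range(N):
        coupling = sum(math.sin(p - phases[i]) for p in phases) / N
        new_phases.append(phases[i] + dt * (freqs[i] + K * coupling))
    phases = new_phases

print(order_parameter(phases) > 0.95)  # coupling drives near-synchrony
```

Because the coupling strength K far exceeds the spread in natural frequencies, the oscillators phase-lock; weaken K enough and they drift independently, which is the knob such networks would exploit.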

More Human-Like AI Processing

Neuromorphic computing replicates brain function at a behavioral level; it does not aim for consciousness. Unlike traditional computers, which follow fixed, linear logic, neuromorphic systems may process information nonlinearly, adapting better to ambiguous or incomplete data.

This adaptability could be particularly impactful for applications like self-driving cars, where AI must handle unpredictable scenarios that typical algorithms struggle with.

Neuromorphic chips could enable real-time decision-making on the device itself, improving responsiveness and safety in autonomous vehicles. Rather than a single chip handling all tasks, multiple specialized neuromorphic chips might each focus on discrete problems, such as navigation or obstacle detection.

Conclusion

By drawing inspiration from the brain’s efficient architecture, neuromorphic computing offers a promising path to AI systems that are both energy efficient and better at handling complex, real-world tasks. This approach could redefine how AI hardware is designed, especially for applications demanding low energy consumption and high adaptability.
