Microsoft's Light-Based Computer Promises 100x Energy Efficiency for AI and Optimization Tasks

Microsoft’s new analog optical computer uses light to perform calculations and could make specific AI and optimization workloads roughly 100 times more energy efficient than today’s digital hardware.

Categorized in: AI News, Science and Research
Published on: Sep 10, 2025

Microsoft’s Light-Based Computer Could Boost AI Efficiency by 100 Times

A new computing prototype developed by Microsoft uses light instead of traditional digital switches to perform calculations, potentially cutting the energy demands of artificial intelligence (AI) by a factor of 100. This analog optical computer (AOC) introduces a fresh approach to computation that could outperform current digital systems in energy efficiency for specific AI and optimization tasks.

Unlike digital computers that operate by flipping billions of tiny switches, the AOC relies on micro-LEDs and camera sensors to manipulate light and voltage signals. These analog signals interact in a feedback loop, iteratively refining the solution until reaching a steady state—a final answer to the problem at hand.
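
To make the idea concrete, here is a minimal sketch in Python of the kind of iterative, steady-state computation described above: a state vector is repeatedly passed through a fixed transformation (standing in for the light-and-voltage feedback loop) until it stops changing. The tanh update rule, the problem size, and the convergence threshold are illustrative assumptions, not details of Microsoft's hardware.

```python
import numpy as np

def analog_style_fixed_point(W, b, max_iters=1000, tol=1e-6, seed=0):
    """Toy model of the feedback loop: push a state vector through a
    fixed transformation until it stops changing (a steady state).
    The tanh update, sizes, and tolerance are illustrative assumptions,
    not the AOC's actual physics."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(b.shape)          # initial state (the "light pattern")
    for i in range(max_iters):
        x_next = np.tanh(W @ x + b)           # one pass through the "optics"
        if np.linalg.norm(x_next - x) < tol:  # converged: steady state reached
            return x_next, i
        x = x_next
    return x, max_iters

# Example: a small random problem; weak coupling keeps the loop convergent.
rng = np.random.default_rng(1)
W = 0.1 * rng.standard_normal((8, 8))
b = rng.standard_normal(8)
solution, iterations = analog_style_fixed_point(W, b)
print(f"steady state reached after {iterations} iterations")
```

The final vector is the "answer": once the loop stops changing, the state encodes the solution, which is what the hardware reads out when the light and voltage signals settle.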

Energy Efficiency and Speed Advantages

One of the significant benefits of the AOC is that it processes analog signals directly without converting them to digital form during computation. This eliminates energy loss typically associated with analog-to-digital conversion and bypasses some speed limitations inherent in digital computing.

According to Jannes Gladrow, a Microsoft AI researcher involved in the study, this approach offers approximately a hundredfold improvement in energy efficiency—something unprecedented in current hardware.

However, the AOC is not a general-purpose computer. It is specialized to find steady-state solutions for certain AI problems and optimization challenges, making it highly efficient in those contexts but limited for broader computing tasks.

A New Computational Paradigm and Digital Twin

Microsoft’s team has also created a “digital twin,” a software model that simulates the AOC’s physical computations. This digital twin can be scaled to tackle more complex problems beyond the current prototype’s capacity.

Michael Hansen, senior director at Microsoft Health Futures, explained that the digital twin allows researchers to work on larger variable sets and more intricate calculations that the physical device cannot yet handle.
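
As a rough illustration of the digital-twin idea, the same steady-state iteration can be run purely in software and scaled to more variables than a physical prototype handles, optionally with a noise term standing in for analog imperfections. Everything in this sketch (the problem size, the Gaussian noise model, the tanh update) is an assumption made for illustration, not a description of Microsoft's simulator.

```python
import numpy as np

def digital_twin_run(n_vars=2000, noise_std=0.01, iters=300, seed=0):
    """Software stand-in for the hardware: run the same steady-state
    iteration at a scale the physical prototype may not yet reach.
    The Gaussian noise term (a crude model of analog imperfections),
    the tanh update, and the sizes are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    # Scale W so the iteration stays stable even at larger sizes.
    W = rng.standard_normal((n_vars, n_vars)) / (10.0 * np.sqrt(n_vars))
    b = rng.standard_normal(n_vars)
    x = np.zeros(n_vars)
    for _ in range(iters):
        noise = noise_std * rng.standard_normal(n_vars)  # analog imperfections
        x = np.tanh(W @ x + b + noise)
    return x

state = digital_twin_run()
print("first few components of the steady state:", np.round(state[:5], 4))
```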

Performance on Machine Learning and Optimization Tasks

  • The AOC prototype has successfully performed simple machine learning tasks like image classification, matching the results of digital computers.
  • The digital twin reconstructed a 320-by-320-pixel brain scan from just 62.5% of the original data, hinting at potential for faster MRI scans.
  • The AOC solved complex financial optimization problems involving fund exchanges and risk minimization more effectively than some quantum computing approaches (a simplified sketch of this kind of risk-minimization problem follows this list).
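
Below is a minimal sketch, in plain NumPy, of the flavor of problem in the last bullet: minimizing portfolio variance under a budget constraint with an iterative scheme that settles toward a steady answer. The clip-and-renormalize gradient heuristic, the synthetic covariance matrix, and all parameters are illustrative assumptions; this is not the AOC's actual problem formulation or algorithm.

```python
import numpy as np

def minimize_risk(cov, steps=5000, lr=0.01):
    """Toy risk minimization: choose portfolio weights that reduce the
    variance w^T cov w while summing to 1, via an iterative
    clip-and-renormalize gradient scheme that settles to a steady
    answer. Illustrative analogue only, not Microsoft's method."""
    n = cov.shape[0]
    w = np.full(n, 1.0 / n)            # start from equal weights
    for _ in range(steps):
        grad = 2.0 * cov @ w           # gradient of the variance objective
        w = w - lr * grad
        w = np.clip(w, 0.0, None)      # no short positions (heuristic projection)
        w = w / w.sum()                # re-impose the budget constraint
    return w

# Example with a small synthetic covariance matrix.
rng = np.random.default_rng(2)
A = rng.standard_normal((6, 6))
cov = A @ A.T / 6.0                    # symmetric positive semi-definite
weights = minimize_risk(cov)
print("weights:", np.round(weights, 3))
print("portfolio variance:", float(weights @ cov @ weights))
```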

Looking Ahead: Scaling the Technology

Currently, the AOC is at the prototype stage. Future versions with more micro-LEDs could process millions or billions of variables simultaneously, greatly expanding their capabilities.

Hitesh Ballani from Microsoft’s Cloud Systems Futures team envisions the AOC becoming an integral part of future computing landscapes, especially where energy efficiency and speed are critical.

This development marks a promising step toward hardware that meets the growing computational demands of AI while addressing sustainability concerns.

For professionals interested in the intersection of AI and advanced computing hardware, exploring emerging technologies like the AOC could offer valuable insights into future research and applications.

Learn more about AI advancements and training resources at Complete AI Training.

