VISION: Voice-Controlled AI Assistant Breaks Barriers for Scientific Discovery at Brookhaven Lab

Brookhaven National Laboratory developed VISION, a voice-controlled AI assistant that operates scientific instruments via natural language. It speeds experiments and frees researchers to focus on discovery.

Published on: Jun 27, 2025

VISION: A Voice-Controlled AI Assistant for Scientific Discovery

Introduction

A team of scientists at the U.S. Department of Energy’s Brookhaven National Laboratory has developed a voice-controlled AI assistant named VISION (Virtual Scientific Companion). This generative AI tool helps busy researchers overcome barriers when working with complex scientific instruments. VISION can bridge knowledge gaps, speed up experiments, and save valuable time, ultimately accelerating scientific progress.

The Concept of VISION

VISION lets users operate scientific instruments by communicating in plain language. Whether the task is running an experiment, launching a data analysis, or visualizing results, the AI companion carries it out. The tool is customized for each instrument to provide precise support. Details about VISION were recently published in Machine Learning: Science and Technology.

Scientists often spend excessive time on routine tasks. VISION acts as a conversational assistant, answering basic questions about instrument capabilities and operations, freeing researchers to focus on their science.

Collaboration and Development

VISION is the result of collaboration between Brookhaven’s Center for Functional Nanomaterials (CFN) and the National Synchrotron Light Source II (NSLS-II), both DOE Office of Science user facilities. These teams work with users on scientific planning and data analysis from experiments at NSLS-II’s beamlines, which use X-ray beams to study material structures.

To address bottlenecks at the highly sought-after beamlines, the project received support through a DOE Early Career Award in 2023, enabling the development of this AI assistant concept.

Functionality of VISION

VISION is built on large language models (LLMs), the technology behind popular AI assistants like ChatGPT. These models generate natural language text, but VISION goes further by generating decisions and computer code to control instruments.

The system is organized into multiple “cognitive blocks” or “cogs,” each powered by an LLM that handles specific tasks. These cogs work together to provide a seamless experience, executing commands transparently for the user.
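The cog architecture described above can be pictured as a small pipeline: one block interprets the request, and another produces the output for it. The sketch below is purely illustrative and assumes nothing about Brookhaven's actual implementation; the class names (`Cog`, `ClassifierCog`, `CodeGenCog`) are invented for this example, and the language-model calls are stubbed with simple keyword rules.

```python
# Hypothetical sketch of a "cognitive block" (cog) pipeline.
# In the real system each cog would wrap an LLM call for one task;
# here the LLM is stubbed with keyword rules so the sketch runs standalone.

from dataclasses import dataclass


@dataclass
class Cog:
    """One cognitive block: a named task handler."""
    name: str

    def run(self, text: str) -> str:
        raise NotImplementedError


class ClassifierCog(Cog):
    """Routes a spoken request to the appropriate downstream cog."""
    def run(self, text: str) -> str:
        lowered = text.lower()
        if "measure" in lowered or "detector" in lowered:
            return "operate"
        return "analyze"


class CodeGenCog(Cog):
    """Turns an instrument request into reviewable control code."""
    def run(self, text: str) -> str:
        # A real system would prompt an LLM; we emit a placeholder command.
        return f"# generated for: {text}\ninstrument.execute({text!r})"


def pipeline(request: str) -> str:
    """Chain the cogs: classify the request, then generate code for it."""
    route = ClassifierCog("classifier").run(request)
    if route == "operate":
        return CodeGenCog("codegen").run(request)
    return f"# analysis route for: {request}"


print(pipeline("Take a measurement every minute for five seconds"))
```

The key design idea this sketch captures is separation of concerns: each cog handles one narrow task, so blocks can be swapped or retuned per instrument without touching the rest of the chain.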

For example, a researcher can say, "I want to select certain detectors," or "Take a measurement every minute for five seconds," and VISION converts these commands into executable code. Users can review the generated code on the beamline workstation before running it.
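To make the "measurement every minute for five seconds" example concrete, the generated code might resemble the sketch below. This is an assumption for illustration only: `measure` stands in for a real beamline exposure call, and VISION's actual output would depend on the instrument's control software.

```python
# Illustrative only: what generated scheduling code for
# "Take a measurement every minute for five seconds" might look like.
# `measure` is a hypothetical stand-in for a real detector call.

import time


def measure(duration_s: float) -> str:
    """Hypothetical detector exposure lasting `duration_s` seconds."""
    return f"counts collected over {duration_s} s"


def run_schedule(interval_s: float, exposure_s: float, repeats: int) -> list:
    """Take one `exposure_s`-second measurement every `interval_s` seconds."""
    results = []
    for i in range(repeats):
        results.append(measure(exposure_s))
        if i < repeats - 1:
            # Wait out the remainder of the interval before the next exposure.
            time.sleep(interval_s - exposure_s)
    return results


# In practice interval_s=60 and exposure_s=5; short values keep the demo fast.
print(run_schedule(interval_s=0.2, exposure_s=0.1, repeats=3))
```

Because code like this is displayed on the beamline workstation before it runs, a researcher can verify the schedule matches what they asked for, which is the review step the article describes.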

Advantages of Natural Language Processing

VISION’s strength lies in its natural language interface. Because the assistant is tailored to each instrument, researchers no longer need to manually configure software parameters. This streamlines experiments and lets scientists dedicate more time to analysis and discovery.

Noah van der Vleuten, a key developer of VISION’s code generation, noted that this approach enhances efficiency and reduces routine workload for researchers.

Future Developments

With the core architecture in place and a working demonstration at the CMS beamline, the team plans to test VISION further with beamline scientists and users. The goal is to expand its use to other beamlines based on real user feedback.

Kevin Yager, leader of the AI-Accelerated Nanoscience Group at CFN, sees VISION as a foundation for broader AI applications across the DOE complex. He envisions a network of AI agents working together to support scientific research more effectively.

Conclusion

VISION is supported by the DOE Early Career Research Program and the DOE Office of Science. Brookhaven National Laboratory continues to focus on advancing basic research in physical sciences by developing innovative tools that address pressing scientific challenges.