AI-Assisted Brain-Computer Interface Offers New Hope for People With Paralysis

Researchers created a noninvasive brain-computer interface that uses AI to help users control robotic arms and cursors faster and more accurately. A paralyzed participant completed tasks that were possible only with the AI-assisted system.

Published on: Sep 02, 2025

Noninvasive Brain-Computer Interface Enhanced with Artificial Intelligence

Researchers have developed a noninvasive brain-computer interface (BCI) system that uses artificial intelligence (AI) to improve users’ ability to control robotic arms or cursors. The system converts brain signals recorded via electroencephalography (EEG) into movement commands, while an AI-powered camera interprets the user’s intent in real time. In tests, participants, including a paralyzed individual, completed tasks faster and more accurately with AI assistance; the paralyzed participant achieved actions that were impossible without it.

Key Facts

  • Noninvasive Breakthrough: Combines EEG-based brain signal decoding with AI vision for shared autonomy.
  • Faster Task Completion: All participants finished tasks significantly faster with AI assistance; the paralyzed participant completed the robotic arm task only with AI support.
  • Accessible Alternative: Provides safer, lower-risk options compared to invasive surgical implants.

Overview of the Research

Engineers at UCLA created a wearable, noninvasive BCI that integrates AI as a co-pilot to interpret user intent and assist in moving a robotic arm or computer cursor. Detailed in Nature Machine Intelligence, the system demonstrated unprecedented performance for noninvasive BCIs. Notably, a paralyzed participant completed a robotic arm task in about six-and-a-half minutes with AI help—an achievement not possible without it.

This technology could expand assistive options for people with paralysis or neurological disorders, making it easier to handle and move objects precisely. The team designed custom algorithms to decode EEG brain signals that indicate movement intention. These signals are paired with an AI camera system that interprets user intent continuously, enabling faster and more accurate task completion.

“By using AI to complement brain-computer interfaces, we aim to offer safer, less invasive solutions,” explained the lead researcher. “Our goal is to develop AI-BCI systems that provide shared autonomy, helping individuals with movement disorders like paralysis or ALS regain independence in daily activities.”

Challenges with Current Brain-Computer Interfaces

While implanted BCIs can translate brain signals into commands, their benefits are often overshadowed by the risks and costs of neurosurgery. More than 20 years after their initial demonstrations, such devices remain limited to small clinical trials. Wearable BCIs, on the other hand, have shown lower accuracy and reliability in decoding brain signals.

To tackle these issues, the research team tested their AI-assisted, noninvasive BCI with four participants—three without motor impairments and one paralyzed from the waist down. Participants wore EEG caps, and the team used custom algorithms to translate brain signals into cursor and robotic arm movements. An AI system with a camera observed these movements and helped complete two tasks.

  • Move a cursor on a screen to hit eight targets, holding each for at least half a second.
  • Use a robotic arm to move four blocks on a table from their original locations to designated spots.

All participants completed these tasks significantly faster with AI assistance. The paralyzed participant finished the robotic arm task only with AI support, in about six-and-a-half minutes; without it, the task could not be completed at all.

The BCI decoded electrical brain signals related to intended actions. Meanwhile, the AI system used computer vision to infer the user’s intent from the scene rather than merely tracking gaze, guiding the cursor toward targets and helping position blocks accurately.
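The shared-autonomy idea described here can be sketched as a simple blend of the EEG-decoded user command with an AI "co-pilot" vector pointing at the camera-inferred target. The function name, blending weight, and vector math below are illustrative assumptions for intuition, not the study's actual algorithm:

```python
import math

def shared_autonomy_step(cursor_pos, eeg_velocity, inferred_target,
                         alpha=0.5, gain=1.0):
    """Blend the user's EEG-decoded velocity with an AI co-pilot vector
    aimed at the inferred target (illustrative sketch, not the UCLA system)."""
    dx = inferred_target[0] - cursor_pos[0]
    dy = inferred_target[1] - cursor_pos[1]
    dist = math.hypot(dx, dy)
    if dist > 1e-9:
        # Unit vector from the cursor toward the AI-inferred target.
        ai_vx, ai_vy = gain * dx / dist, gain * dy / dist
    else:
        ai_vx = ai_vy = 0.0  # already at the target: no AI correction
    # alpha = 0 is pure user control; alpha = 1 is pure AI control.
    return ((1 - alpha) * eeg_velocity[0] + alpha * ai_vx,
            (1 - alpha) * eeg_velocity[1] + alpha * ai_vy)

# A noisy user command is nudged toward the target at (1, 0).
blended = shared_autonomy_step((0.0, 0.0), (0.2, 0.3), (1.0, 0.0))
```

The key design choice is the blending weight: too much AI assistance removes the user's sense of agency, while too little leaves the noisy EEG decode uncorrected.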

“Future AI-BCI systems could feature more advanced co-pilots that control robotic arms with greater speed and precision, adapting their grip to different objects,” said a co-lead researcher. “Increasing training data could further improve AI collaboration on complex tasks and enhance EEG decoding accuracy.”

Research Team and Funding

This study was conducted by members of UCLA’s Neural Engineering and Computation Lab. The team includes experts in electrical and computer engineering, neuroscience, and AI. The project received funding from the National Institutes of Health and the Science Hub for Humanity and Artificial Intelligence, a collaboration between UCLA and Amazon. UCLA’s Technology Development Group has also applied for a patent related to this AI-BCI technology.
