Sydney Scientists Build AI That Reads Words Directly From Brainwaves

Researchers at the University of Technology Sydney have developed an AI model that decodes words from brainwaves recorded by a non-invasive EEG cap. The technology could enable thought-to-text communication, currently at about 75% accuracy.

Categorized in: AI News, Science and Research
Published on: Jun 16, 2025

AI Model Developed in Sydney to Decode Thoughts from Brainwaves

What if you could control your phone just by thinking? Or have your device boost your focus and memory automatically? These ideas might sound like science fiction, but researchers at the University of Technology Sydney (UTS) are making strides in brain-computer interface technology powered by artificial intelligence (AI). They have developed an AI model that decodes words and sentences directly from brainwaves.

How AI Reads Minds Using EEG

The process begins with a cap fitted with 128 electrodes that record the brain's electrical activity, a technique known as electroencephalography (EEG). This non-invasive recording is then fed into an AI model. Dr. Daniel Leong, a postdoctoral researcher at UTS, wears the cap while the electrodes capture his brain signals in real time.
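For readers who want a concrete sense of what such a recording looks like in software, here is a minimal sketch using the open-source MNE library. The channel names, sampling rate, and random data are placeholder assumptions, not details of the UTS setup.

```python
# A minimal sketch of representing a 128-channel EEG recording in Python,
# using the open-source MNE library. Channel names, sampling rate, and the
# random data below are placeholders, not the UTS team's actual setup.
import numpy as np
import mne

n_channels, sfreq = 128, 256          # assumed sampling rate in Hz
ch_names = [f"EEG{i:03d}" for i in range(n_channels)]
info = mne.create_info(ch_names, sfreq=sfreq, ch_types="eeg")

# Ten seconds of stand-in data in volts; a real cap streams this live.
data = np.random.randn(n_channels, sfreq * 10) * 1e-5
raw = mne.io.RawArray(data, info)
print(raw)  # a 128-channel, 10-second Raw object ready for preprocessing
```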

The AI model, developed by Dr. Leong, PhD student Charles (Jinzhao) Zhou, and Professor Chin-Teng Lin, applies deep learning to translate these EEG signals into specific words. Deep learning uses artificial neural networks to learn patterns from large datasets—in this case, EEG data collected from volunteers reading texts silently. Dr. Leong silently mouths the words, which activates brain regions involved in speech and improves detection accuracy.
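To make the idea concrete, below is a toy sketch of a deep-learning decoder that maps a window of 128-channel EEG to a probability ranking over a small vocabulary. The architecture, window length, and four-word vocabulary are illustrative assumptions; the article does not describe the internals of the UTS model.

```python
# Illustrative sketch only: a toy EEG-to-word classifier, not the UTS model.
# Assumes preprocessed EEG windows of shape (channels=128, samples=256)
# and a small fixed vocabulary, as the article describes.
import torch
import torch.nn as nn

VOCAB = ["jumping", "happy", "just", "me"]  # hypothetical limited word set

class EEGWordDecoder(nn.Module):
    def __init__(self, n_channels=128, n_samples=256, n_words=len(VOCAB)):
        super().__init__()
        # Temporal convolution learns frequency-like features per channel.
        self.temporal = nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12))
        # Spatial convolution mixes information across the 128 electrodes.
        self.spatial = nn.Conv2d(16, 32, kernel_size=(n_channels, 1))
        self.pool = nn.AvgPool2d(kernel_size=(1, 4))
        self.classify = nn.Linear(32 * (n_samples // 4), n_words)

    def forward(self, x):            # x: (batch, 1, channels, samples)
        x = torch.relu(self.temporal(x))
        x = torch.relu(self.spatial(x))
        x = self.pool(x).flatten(1)
        return self.classify(x)      # logits over the word vocabulary

model = EEGWordDecoder()
eeg_window = torch.randn(1, 1, 128, 256)   # stand-in for a real recording
probs = torch.softmax(model(eeg_window), dim=-1)
ranking = sorted(zip(VOCAB, probs[0].tolist()), key=lambda p: -p[1])
print(ranking)  # probability ranking of candidate words
```

In a real system the network would be trained on labeled EEG windows; untrained, this sketch only shows the shape of the pipeline: multichannel signal in, ranked word probabilities out.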

For example, when Dr. Leong thinks of the phrase "jumping happy just me" and mouths it silently, the AI analyzes the brainwaves and produces a probability ranking of possible words. To refine the output, a second AI system, a large language model similar to ChatGPT, matches the decoded words against likely phrasing and corrects errors to form a coherent sentence. In this demonstration, the final output was "I am jumping happily, it's just me."
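The two-stage idea can be sketched as follows: the decoder proposes candidate words with probabilities, and a language model re-scores the combinations. The lm_score function below is a hypothetical stand-in for a real LLM's fluency scoring; in the UTS system the language model also rewrites the winning candidates into a natural sentence.

```python
# Illustrative sketch of the second stage: a language model re-ranks the raw
# word probabilities from the EEG decoder. lm_score is a hypothetical
# stand-in for a real LLM scoring call, not an actual API.
from itertools import product

# Hypothetical decoder output: per-position candidate words with probabilities.
decoded = [
    [("jumping", 0.6), ("bumping", 0.4)],
    [("happy", 0.7), ("sappy", 0.3)],
    [("just", 0.8), ("gist", 0.2)],
    [("me", 0.9), ("tree", 0.1)],
]

def lm_score(sentence: str) -> float:
    """Stand-in for an LLM's fluency score; higher means more plausible."""
    return 1.0 if "jumping happy just me" in sentence else 0.1

best_sentence, best_score = None, float("-inf")
for combo in product(*decoded):
    words = [w for w, _ in combo]
    eeg_prob = 1.0
    for _, p in combo:
        eeg_prob *= p
    # Combine decoder confidence with language-model plausibility.
    score = eeg_prob * lm_score(" ".join(words))
    if score > best_score:
        best_sentence, best_score = " ".join(words), score

print(best_sentence)  # "jumping happy just me"
```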

Currently, the AI model learns from a limited word set, making individual word detection easier. The team is expanding their dataset by recruiting more participants and plans to explore real-time communication between two individuals using this technology.

Brain-Computer Interfaces: Progress and Challenges

Brain-computer interfaces (BCIs) have existed for decades. Early breakthroughs include implantable devices that allowed people with paralysis to control computer cursors. Today, companies such as Elon Musk's Neuralink are advancing implantable systems aimed at restoring autonomy to people with quadriplegia.

Non-invasive EEG BCIs, like the one developed at UTS, offer portability and avoid surgery, but the signals tend to be noisy because the electrodes sit on the scalp rather than inside the brain. Professor Lin explains that AI helps filter and amplify these signals to reduce noise and isolate speech-related brain activity. However, the precision is limited since signals from various brain regions mix together on the skull's surface.
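The article does not say which methods the UTS team uses to clean up the signal, but a conventional first step for scalp EEG is band-pass filtering, sketched below with SciPy. The 1 to 40 Hz band and the sampling rate are illustrative assumptions.

```python
# Illustrative sketch of one classical step in cleaning scalp EEG: a
# band-pass filter keeping roughly 1-40 Hz, where most studied EEG activity
# lies. This is a conventional stand-in, not the UTS team's AI-based method.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 256                      # assumed sampling rate in Hz
t = np.arange(0, 2, 1 / fs)   # two seconds of signal
# Synthetic channel: a 10 Hz "brain" rhythm buried in broadband noise.
signal = np.sin(2 * np.pi * 10 * t) + 0.8 * np.random.randn(t.size)

b, a = butter(N=4, Wn=[1, 40], btype="bandpass", fs=fs)
clean = filtfilt(b, a, signal)  # zero-phase filtering avoids time shifts

print(f"std before: {signal.std():.2f}, after: {clean.std():.2f}")
```

Filtering narrows the search, but it cannot separate signals from different brain regions that overlap on the scalp, which is why the deep-learning stage is still needed.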

Bioelectronics expert Mohit Shivdasani from the University of New South Wales notes that AI can now recognize brainwave patterns that were previously undetectable. AI can also personalize brainwave decoding for individuals, especially in implantable devices. The UTS team employs a form of "neurofeedback," where the AI adapts to how different people think and speak, creating a co-learning system between human and machine.

The system currently converts thought to text with about 75% accuracy. The team's target is 90%, which would be comparable to implantable BCI systems.

Applications in Medicine and Beyond

Mind-reading AI via EEG has promising applications in stroke rehabilitation and speech therapy for people with autism. Dr. Shivdasani highlights that autonomous brain-machine interfaces could assist stroke patients by encouraging brain activity during recovery phases, potentially reducing the need for long-term device use.

Speech therapy for autism could benefit from closed-loop BCIs that provide real-time feedback based on brain activity. Beyond medical uses, this technology might enhance attention, memory, and emotional regulation, though these applications remain speculative and require further research.

Future Directions and Ethical Considerations

Before brain-controlled devices become mainstream, the technology needs to become more practical and wearable. A cap with electrodes and wires is not suitable for everyday use. Professor Lin envisions integration with devices like augmented reality glasses or earbuds equipped with electrodes to monitor brain signals discreetly.

Ethical issues around privacy and the responsible use of brain data are critical. Dr. Shivdasani emphasizes the need to define clear guidelines for how this powerful technology should be applied to protect individuals' cognitive privacy and autonomy.

As this research progresses, it will be important to balance technological possibilities with ethical safeguards.

For those interested in the intersection of AI and neuroscience, exploring AI courses related to brain-computer interfaces and deep learning could provide valuable insights into this emerging field.

