Sydney team develops AI model to identify thoughts from brainwaves

9:55 am on 16 June 2025

By Catherine Hanrahan and Sharon Gordon, ABC News

An AI model to decode words and sentences from brainwaves is in development. Photo: ABC News / Sharon Gordon

What if you could operate your phone just by thinking about it?

And imagine your phone automatically enhancing your concentration and memory, or even being used to read someone else's mind.

It sounds like science fiction, but this technology, called a brain-computer interface, is being supercharged with the advent of artificial intelligence (AI).

Australian researchers at the University of Technology Sydney (UTS) are at the forefront of exploring how AI can be used to read our minds.

Here's a walk-through of how it works.

Using AI to read minds

The electrode cap is connected to amplifiers that read the brainwaves and feed data into the AI model. Photo: ABC News / Sharon Gordon

Postdoctoral research fellow Daniel Leong sits in front of a computer at the GrapheneX-UTS Human-centric Artificial Intelligence Centre wearing what looks like a rubber swimming cap with wires coming out of it.

The cap's 128 electrodes detect electrical impulses from Dr Leong's brain cells and record them on a computer.

It's called an electroencephalogram (EEG), the same technology doctors use to diagnose brain conditions.

The UTS team is using it to read his thoughts.

A pioneering AI model, developed by Dr Leong, PhD student Charles (Jinzhao) Zhou and his supervisor Chin-Teng Lin, uses deep learning to translate EEG brain signals into specific words.

Deep learning is a form of AI that uses artificial neural networks, loosely modelled on the human brain, to learn from data, in this case large amounts of EEG data.

Dr Leong thinks about each word and mouths it silently, which enhances the areas of the brain involved in speech recognition. Photo: ABC News / Sharon Gordon

Dr Leong reads the simple sentence "jumping happy just me" slowly and silently on the screen.

He also mouths the words, which activates the parts of the brain involved in speech and makes the words easier to detect.

The AI model decodes the words in real time and produces a probability ranking, based on what it has learned from EEG recordings of 12 volunteers reading texts.

At this stage, Professor Lin says, the AI model has been trained on a limited collection of words and sentences, which makes individual words easier to detect.
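For readers who want a concrete picture, here is a minimal sketch, in Python with PyTorch, of how a decoder like this can be structured. Everything in it, the architecture, the layer sizes and the four-word vocabulary, is an illustrative assumption rather than the UTS team's actual model; the point is that the network takes a window of 128-channel EEG and returns a ranked set of word probabilities.

    # Minimal sketch of an EEG-to-word decoder (illustrative, not the UTS model).
    import torch
    import torch.nn as nn

    VOCAB = ["jumping", "happy", "just", "me"]  # toy vocabulary

    class EEGWordClassifier(nn.Module):
        def __init__(self, n_channels=128, n_words=len(VOCAB)):
            super().__init__()
            # 1D convolutions learn temporal patterns across EEG channels
            self.features = nn.Sequential(
                nn.Conv1d(n_channels, 64, kernel_size=7, padding=3),
                nn.ReLU(),
                nn.MaxPool1d(4),
                nn.Conv1d(64, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.AdaptiveAvgPool1d(1),
            )
            self.classifier = nn.Linear(32, n_words)

        def forward(self, x):
            # x: (batch, channels, time samples) -> (batch, n_words) logits
            return self.classifier(self.features(x).squeeze(-1))

    model = EEGWordClassifier()
    window = torch.randn(1, 128, 256)             # one fake 128-channel EEG window
    probs = torch.softmax(model(window), dim=-1)  # probability ranking over words
    for word, p in sorted(zip(VOCAB, probs[0].tolist()), key=lambda t: -t[1]):
        print(f"{word}: {p:.2f}")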

The AI model has detected the individual words, based on the pattern of brainwaves. Photo: ABC News / Sharon Gordon

A second type of AI, a large language model, takes the decoded words, corrects mistakes from the EEG decoding, and assembles them into a sentence.

Large language models, like ChatGPT, have been trained on massive text datasets to understand and generate human-like text.
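As an illustration of this second stage, the sketch below hands the decoder's ranked guesses to a language model and asks for a corrected sentence. The OpenAI-style chat API, the model name and the prompt are assumptions made for the example; the article does not say which LLM or interface the team uses.

    # Sketch: ask an LLM to turn noisy ranked word guesses into a sentence.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    decoded = [  # top guesses per word slot from the EEG decoder, with scores
        [("jumping", 0.62), ("dumping", 0.21)],
        [("happy", 0.74), ("puppy", 0.11)],
        [("just", 0.58), ("dust", 0.19)],
        [("me", 0.81), ("we", 0.09)],
    ]

    prompt = (
        "An EEG decoder produced these ranked word guesses, one slot per line:\n"
        + "\n".join(", ".join(f"{w} ({p:.2f})" for w, p in slot) for slot in decoded)
        + "\nWrite the single most plausible natural sentence, correcting any "
        "decoding mistakes."
    )

    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    print(reply.choices[0].message.content)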

"I am jumping happily, it's just me" is the sentence the AI model has come up with, with no input from Dr Leong apart from his brainwaves.

The AI model has come up with a predicted sentence based on Dr Leong's brainwaves, which is close to the original one he read. Photo: ABC News / Sharon Gordon

Like a lot of things AI is doing at the moment, it's not perfect.

The team is recruiting more people to read text while wearing the EEG cap to refine the AI model.

They also plan to test whether the AI model can be used for communication between two people.

Brain-computer interfaces have been around for decades

Technology reading the brain's signals is steadily improving. Photo: ABC News / Sharon Gordon

Twenty years ago a man with quadriplegia had a device implanted in his brain that allowed him to control a mouse cursor on a screen.

It was the first time a brain-computer interface had been used to restore function lost to paralysis.

Tech billionaire Elon Musk's Neuralink is working on a modern version of this implantable technology to restore autonomy to people with quadriplegia.

A non-invasive EEG brain-computer interface has the obvious advantage of being portable and not requiring surgery, but because it sits outside the brain, the signals are noisy.

Chin-Teng Lin says the AI model can identify words amid the noise generated by electroencephalogram signals. Photo: ABC News / Warwick Ford

"We can't get very precise because with non-invasive you can't actually put it into that part of the brain that decodes words," Professor Lin said.

"There's also some mix up, right? Since the signal you measure on the skull surfaces come from different sources and they mix up together."

That's where the AI comes in. It amplifies and filters the brain signals to reduce noise and generate speech markers.
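Filtering like this is standard EEG practice. The sketch below, assuming Python with SciPy, band-pass filters raw multichannel EEG to suppress noise outside the band of interest; the cut-off frequencies and sampling rate are illustrative choices, not the team's published settings.

    # Band-pass filter raw EEG to reduce out-of-band noise (illustrative settings).
    import numpy as np
    from scipy.signal import butter, filtfilt

    def bandpass(eeg, low_hz=1.0, high_hz=45.0, fs=250.0, order=4):
        """Zero-phase band-pass filter applied along each channel's time axis."""
        nyquist = fs / 2.0
        b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
        return filtfilt(b, a, eeg, axis=-1)

    raw = np.random.randn(128, 2500)  # 128 channels, 10 s at 250 Hz (fake data)
    clean = bandpass(raw)
    print(clean.shape)                # (128, 2500): same shape, less noise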

Mohit Shivdasani is a bioelectronics expert at the University of NSW.

Researchers have been looking for patterns in biological signals "forever", he said, but AI can now recognise brainwave patterns that have never been identified before.

He said AI, particularly when used in implantable devices, could quickly personalise decoding to the way an individual's brain completes a task.

Mohit Shivdasani says AI has huge potential in detecting the unknown brainwave patterns involved in cognitive functions. Photo: ABC News / Andrew Whitington

"What AI can do is very quickly be able to learn what patterns correspond to what actions in that given person. And a pattern that's revealed in one person may be completely different to a pattern that's revealed in another person," he said.

Professor Lin said that is exactly what the team is doing to improve its AI model, using "neurofeedback", in which the model tunes in to the way different people speak.

"To help AI to learn better, we call this technology a kind of AI-human co-learning," he said.

The team is achieving about 75 percent accuracy in converting thoughts to text, and Professor Lin said they were aiming for 90 percent, similar to what implanted models achieve.
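The article does not spell out how that figure is measured; one simple possibility, sketched below, is word-level accuracy between the sentence read and the sentence decoded.

    # Sketch: one plausible accuracy metric (assumed, not the team's exact one).
    def word_accuracy(reference: str, predicted: str) -> float:
        ref, pred = reference.lower().split(), predicted.lower().split()
        hits = sum(r == p for r, p in zip(ref, pred))
        return hits / max(len(ref), len(pred))

    print(word_accuracy("jumping happy just me", "jumping happy dust me"))  # 0.75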

Huge potential in medicine, and beyond

Dr Shivdasani says the AI mind-reading technology could be used in stroke rehabilitation and speech therapy for autism. Photo: ABC News / Andrew Whitington

Dr Shivdasani said non-invasive EEG combined with mind-reading AI has potential in managing stroke patients in hospitals.

"One of the awesome things about the brain is its ability to heal, so I can see a situation is where an autonomous brain-machine interface is used during the rehabilitation phase to allow the brain to keep working and to keep trying for a certain task," he said.

If the brain cells regenerate, the patient may no longer require the technology, he said.

Helping with speech therapy for people with autism is another potential use.

Such rehabilitative uses rely on a "closed loop" brain-computer interface, where real-time feedback comes from the user's brain activity.
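Schematically, a closed-loop interface keeps cycling through acquiring brain activity, decoding it, and feeding the result straight back to the user. The sketch below shows that cycle; every function in it is a hypothetical placeholder, not a real device API.

    # Schematic closed loop: acquire -> decode -> feedback, repeated.
    import time

    def read_eeg_window():
        """Placeholder: return the latest buffered EEG window from the cap."""
        ...

    def decode_intent(window):
        """Placeholder: run the trained model, return the predicted action."""
        ...

    def give_feedback(intent):
        """Placeholder: display or apply the action so the user can react."""
        ...

    def closed_loop(duration_s=60, step_s=0.5):
        """Run the feedback cycle for a fixed rehabilitation session."""
        end = time.time() + duration_s
        while time.time() < end:
            window = read_eeg_window()      # real-time brain activity in
            intent = decode_intent(window)  # the model's best guess
            give_feedback(intent)           # feedback out, closing the loop
            time.sleep(step_s)              # the brain's response feeds the next pass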

Leaping into the realm of science fiction is the possibility of using this technology to enhance our attention, memory, focus and even emotional regulation.

"As scientists, we look at a medical condition and we look at what function has been affected by that medical condition. What is the need of the patient? We then address that unmet need through technology to restore that function back to what it was," Dr Shivdasani said.

"After that, the sky's the limit."

The UTS team is working on perfecting their AI model to read thoughts in the mind. Photo: ABC News / Warwick Ford

Before we start operating our phones with our minds or even communicating directly from brain to brain, the technology needs to become more "wearable".

No-one is going to walk around in a cap with wires coming out of it.

Professor Lin said the technology could interact with devices like the augmented reality glasses already on the market.

Big tech is already working on earbuds with electrodes to measure brain signals.

Then there's our "brain privacy" and other ethical considerations, Dr Shivdasani said.

"We have the tools but what are we going to use them for? And how ethically are we going to use them? That's with any technology that allows us to do things we've never been able to do."

- ABC News
