New portable, non-invasive AI system turns thoughts into text

Sydney, Dec 12 (IANS) Australian researchers have developed a portable, non-invasive system that can decode silent thoughts and turn them into text using artificial intelligence (AI).

The technology could aid communication for people who are unable to speak due to illness or injury, including stroke or paralysis.

It could also enable seamless communication between humans and machines, such as the operation of a bionic arm or robot.

Researchers from the GrapheneX-UTS Human-centric Artificial Intelligence Centre at the University of Technology Sydney (UTS), who developed the system, describe it as a “world first”.

In the study, participants silently read passages of text while wearing a cap that recorded electrical brain activity through their scalp using an electroencephalogram (EEG).

The EEG signal is segmented into distinct units that capture specific characteristics and patterns of brain activity. This segmentation is performed by an AI model called DeWave, developed by the researchers.

DeWave translates EEG signals into words and sentences by learning from large quantities of EEG data.
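The paper describes DeWave's architecture in detail; as a rough, hypothetical sketch of what "discrete encoding" of a continuous EEG signal can look like, the example below quantises fixed-length windows of a signal against a small codebook (a vector-quantisation step), producing integer tokens that a downstream language model could translate into words. The window length, codebook size and function names here are invented for illustration and are not taken from the study, where the encoding is learned from data rather than using a random codebook.

```python
import numpy as np

def segment_windows(eeg: np.ndarray, window: int = 64) -> np.ndarray:
    """Split a 1-D EEG channel into fixed-length, non-overlapping windows."""
    n = (len(eeg) // window) * window
    return eeg[:n].reshape(-1, window)

def quantise(windows: np.ndarray, codebook: np.ndarray) -> np.ndarray:
    """Map each window to the index of its nearest codebook vector (discrete encoding)."""
    # Euclidean distance from every window to every codebook entry
    dists = np.linalg.norm(windows[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.argmin(axis=1)  # one integer token per window

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(64 * 10)          # stand-in for one EEG channel
    codebook = rng.standard_normal((128, 64))   # 128 code vectors (random here; learned in practice)
    tokens = quantise(segment_windows(eeg), codebook)
    print(tokens)  # discrete IDs a language-model decoder could map to words
```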

“This research represents a pioneering effort in translating raw EEG waves directly into language, marking a significant breakthrough in the field,” said Professor C.T. Lin, Director of the GrapheneX-UTS HAI Centre.

“It is the first to incorporate discrete encoding techniques in the brain-to-text translation process, introducing an innovative approach to neural decoding. The integration with large language models is also opening new frontiers in neuroscience and AI,” he said.

Previous technology to translate brain signals to language has either required surgery to implant electrodes in the brain, such as Elon Musk’s Neuralink, or scanning in an MRI machine, which is large, expensive, and difficult to use in daily life.

These methods also struggle to transform brain signals into word-level segments without additional aids such as eye-tracking, which restricts the practical application of these systems.

The new technology can be used either with or without eye-tracking. The research was carried out with 29 participants, which means it is likely to be more robust and adaptable than previous decoding technology that has been tested on only one or two individuals, because EEG waves differ between individuals.

The use of EEG signals received through a cap, rather than from electrodes implanted in the brain, means that the signal is noisier.

The translation accuracy score is currently around 40 per cent, which the team hopes to improve to 90 per cent.

The study was selected as the spotlight paper at the NeurIPS conference, a top-tier annual meeting that showcases world-leading research on artificial intelligence and machine learning, held in New Orleans, US.
