AI decodes brain activity to reveal the meaning behind thoughts

Artificial intelligence (AI) has made non-invasive mind-reading possible by converting thoughts into text, according to a recent study published in the journal Nature Neuroscience. The achievement could transform communication for people who have lost the ability to speak because of a stroke or motor neuron disease. At its core is an AI-based decoder that translates brain activity into a continuous stream of text. Rather than attempting to read out activity word by word, the system maps patterns of neuronal activity onto strings of words that carry particular meanings.

Previous language-decoding systems have required surgical implants; the new decoder is entirely non-invasive. It was developed by neuroscientists at the University of Texas at Austin, who used functional magnetic resonance imaging (fMRI) to reconstruct speech from brain activity with uncanny accuracy. In doing so, the team worked around a fundamental limitation of fMRI: although the technique can pinpoint where in the brain activity occurs with high spatial resolution, the blood-flow signal it measures lags behind the underlying neural activity, making word-by-word tracking in real time impossible.

The team trained the decoder using GPT-1, an early language model from OpenAI and a precursor to ChatGPT, together with fMRI scans, and then generated text from brain activity alone. About half the time, the output closely matched the intended meaning of the original words, and it occasionally matched it precisely. In other words, the decoder works at the level of ideas and meaning rather than exact words.
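To make that idea concrete, here is a minimal, purely illustrative Python sketch of semantic decoding of this general kind: a language model proposes candidate word sequences, and each candidate is scored by how well the brain activity it is predicted to evoke matches the observed fMRI signal. Every function, embedding, and number below is a hypothetical placeholder, not the study's actual code or pipeline.

```python
import numpy as np

# Illustrative sketch only: beam search over candidate word sequences,
# scored by how well a toy "encoding model" predicts the observed scan.
RNG = np.random.default_rng(0)
VOCAB = ["the", "dog", "ran", "home", "quickly", "and", "slept"]
N_VOXELS = 50  # toy number of fMRI voxels

def lm_propose(prefix, k=3):
    """Placeholder language model: return k plausible next words for a prefix."""
    return list(RNG.choice(VOCAB, size=k, replace=False))

def embed(words):
    """Placeholder semantic embedding of a word sequence."""
    return np.sum([abs(hash(w)) % 97 for w in words]) * np.ones(8) / 97.0

def encoding_model(semantic_features, W):
    """Predict voxel responses from semantic features (toy linear model)."""
    return semantic_features @ W

def decode(observed_bold, W, n_steps=5, beam_width=3):
    """Keep the candidate sequences whose predicted activity best fits the scan."""
    beams = [([], 0.0)]  # (word sequence, cumulative score)
    for _ in range(n_steps):
        candidates = []
        for words, _ in beams:
            for nxt in lm_propose(words, k=beam_width):
                seq = words + [nxt]
                pred = encoding_model(embed(seq), W)
                score = -np.linalg.norm(pred - observed_bold)  # closer = better
                candidates.append((seq, score))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]

if __name__ == "__main__":
    W = RNG.standard_normal((8, N_VOXELS)) * 0.1   # stand-in encoding weights
    observed = RNG.standard_normal(N_VOXELS)       # stand-in for a real scan
    print("Decoded gist:", " ".join(decode(observed, W)))
```

Because the scoring model in such an approach is fitted to an individual's brain responses, a decoder built this way is inherently tailored to one person, which is consistent with the personalization the researchers report below.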

The decoder also performed well when participants watched short silent videos: from their brain activity alone, it produced accurate descriptions of some of the content. The system still struggled with certain aspects of language, however, including pronouns. It was also highly personalized: when a decoder trained on one participant was tested on another person, the readout was unintelligible.

The achievement opens up a host of experimental possibilities, from reading the thoughts of someone who is dreaming to investigating how new ideas spring from background brain activity. Beyond being technically impressive, the work could serve as a basis for future brain-computer interfaces. The team now hopes to assess whether the technique can be applied to other, more portable brain-imaging systems, such as functional near-infrared spectroscopy (fNIRS).
