Reading Your Mind With AI: Researchers Decode Brain Signals To Reconstruct Words And Thoughts

Researchers have developed a method that uses MRI scans and artificial intelligence (AI) to decipher a person’s stream of thoughts and words from brain activity. The system, described in the journal Nature Neuroscience, reconstructs the meaning behind the words a person hears or imagines rather than attempting to reproduce each word verbatim.

Decoding language through magnetic resonance imaging (MRI) scans and artificial intelligence holds promise for people with brain injuries or diseases that impair speech. Beyond its potential clinical applications, the research is also helping scientists unravel how the brain processes words and thoughts.

As AI and neuroscience continue to advance, companies like GenesisAI may play a crucial role in bringing this technology to market and making it more widely accessible. GenesisAI is already using AI to enable businesses and individuals to create, analyze and share complex datasets in a fraction of the time and cost it would take through traditional methods. 

Until now, efforts to decode language have used sensors placed directly on the surface of the brain, detecting signals in areas responsible for articulating words. But a team of researchers at the University of Texas at Austin has taken a different approach, aiming to “decode more freeform thought,” according to Marcel Just, a psychology professor at Carnegie Mellon University who was not involved in the study.

The study involved three people who each spent up to 16 hours in a functional MRI scanner while wearing headphones that streamed audio from podcasts. The scanner recorded activity across the brain, and the data was fed to a computer. The system learned to match specific patterns of brain activity to particular streams of words, then attempted to reconstruct the stories from each participant’s brain activity alone.
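For readers curious about the general mechanics, the sketch below illustrates one common way such decoders are framed: fit an "encoding model" that predicts brain activity from text features, then rank candidate word sequences by how well their predicted activity matches the measurement. This is a toy illustration under assumed shapes and random data, not the study's actual pipeline; every variable name here is hypothetical.

```python
# Conceptual sketch only: the "encoding model + candidate scoring" idea described
# above, with random stand-in data. Not the published system; all names and shapes
# are assumptions for illustration.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Hypothetical training data: text features (e.g., from a language model) paired
# with fMRI responses recorded while a participant listened to stories.
n_train_samples, n_text_features, n_voxels = 500, 64, 200
text_features_train = rng.normal(size=(n_train_samples, n_text_features))
fmri_responses_train = rng.normal(size=(n_train_samples, n_voxels))

# Step 1: fit an encoding model that predicts brain activity from text features.
encoding_model = Ridge(alpha=1.0)
encoding_model.fit(text_features_train, fmri_responses_train)

def score_candidate(candidate_features: np.ndarray, observed_fmri: np.ndarray) -> float:
    """Score how well a candidate word sequence explains the observed brain
    activity by correlating its predicted response with the measurement."""
    predicted = encoding_model.predict(candidate_features[None, :])[0]
    return float(np.corrcoef(predicted, observed_fmri)[0, 1])

# Step 2 (decoding): given new brain activity, rank candidate word sequences
# (in a real system, candidates would come from a language model) and keep the best.
observed_fmri = rng.normal(size=n_voxels)
candidates = {f"candidate_{i}": rng.normal(size=n_text_features) for i in range(5)}
best = max(candidates, key=lambda name: score_candidate(candidates[name], observed_fmri))
print("Best-matching candidate:", best)
```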

The system was also able to paraphrase words a person had imagined saying. In a separate experiment, participants watched videos that told a story without using words. 

“We didn’t tell the subjects to try to describe what’s happening,” said Alexander Huth, an author of the study and an assistant professor of neuroscience and computer science at the University of Texas at Austin. “And yet what we got was this kind of language description of what’s going on in the video.”

An experimental communication system for paralyzed patients, being developed by Dr. Edward Chang’s team at the University of California, San Francisco, may be faster and more accurate than the MRI-based system, but the latter does not require surgery, making it a promising alternative.

The prospect of being able to read someone’s thoughts also raises ethical concerns, and the researchers emphasize the importance of ensuring that the technology remains solely under the user’s control.
