Scientists Develop a Thought-Decoding Algorithm That Can Retell Stories Imagined in the Mind
Previous methods of reading thoughts often required implanting electrodes deep within the brain. Alexander Huth, a neuroscientist at the University of Texas at Austin, and his colleagues have developed a new thought-reading method based on a non-invasive brain scanning technique called functional magnetic resonance imaging (fMRI), as reported by Science Alert on October 24. The new research has been posted on the preprint server bioRxiv.
Illustration: Andrew Ostrovsky
fMRI tracks the flow of oxygen-rich blood through the brain. Since active brain cells require more energy and oxygen, information from fMRI provides an indirect measure of brain activity.
Such scans were previously considered too slow to capture real-time brain activity, because the electrical signals that brain cells emit change much faster than blood flow does. However, the research team found that they could still use the method to decode the gist of people's thoughts, even if they could not translate them word for word.
In the new study, the team scanned the brains of a woman and two men in their 20s and 30s. Each participant listened to 16 hours of audio files and radio programs, divided into multiple sessions. The researchers then fed the scans to a computer algorithm known as a "decoder," which learned to match patterns in the audio to the recorded brain activity.
The algorithm could then take an fMRI recording alone and generate a story from it. The generated story closely matched the plot of the original audio file or radio program, Huth said. In other words, the decoder could infer the story each person had heard from their brain activity.
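The article does not describe the decoder's internals, but the general idea it sketches, learning a mapping between story content and brain responses, then scoring candidate stories against a new scan, can be illustrated with a toy example. The sketch below is purely hypothetical: the feature vectors, the linear "encoding model" `W`, and the candidate snippets are stand-ins, not anything from the actual study (which used far richer language features and real fMRI data).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical semantic features for three candidate story snippets.
# Orthogonal one-hot vectors keep the toy example simple; the real
# study derived features from the language in the training audio.
snippets = {
    "a walk in the rain": np.eye(3)[0],
    "fixing an old car":  np.eye(3)[1],
    "a phone call home":  np.eye(3)[2],
}

n_voxels = 50
# A linear encoding model mapping semantic features to voxel responses,
# standing in for what would be fit on the 16 hours of training scans.
W = rng.normal(size=(3, n_voxels))

def predict_activity(features):
    """Predicted fMRI response for a snippet's semantic features."""
    return features @ W

def decode(observed, candidates):
    """Return the candidate whose predicted activity correlates best
    with the observed scan -- a toy version of decoder scoring."""
    def corr(a, b):
        return np.corrcoef(a, b)[0, 1]
    return max(candidates,
               key=lambda s: corr(predict_activity(candidates[s]), observed))

# Simulate a noisy scan of someone hearing "fixing an old car".
observed = (predict_activity(snippets["fixing an old car"])
            + rng.normal(scale=0.3, size=n_voxels))
print(decode(observed, snippets))
```

Because the noise is small relative to the signal, the decoder recovers the snippet the simulated scan was generated from; the real system faced the much harder task of generating free-form text rather than choosing among fixed candidates.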
However, the algorithm also made some errors, such as switching characters' pronouns and confusing first- and third-person perspectives. "It knows quite accurately what is happening, but it does not know exactly who is doing it," Huth said.
In additional tests, the algorithm accurately interpreted the plot of a silent film that participants watched in the scanner. It could even recount a story that participants merely imagined telling in their minds. In the long term, the research team aims to develop the technology into brain-machine interfaces for people who are unable to speak or type.