Scientists have invented a language decoder that can translate a person’s thoughts into text using an artificial intelligence (AI) transformer similar to ChatGPT, according to a new study. The breakthrough marks the first time that continuous language has been non-invasively reconstructed from human brain activity, which is recorded with a functional magnetic resonance imaging (fMRI) machine, according to a report in Vice.com.
The report said that the decoder interpreted the gist of stories that human subjects watched, listened to, or even imagined, using fMRI brain patterns — an achievement that essentially allows it to read people’s minds with unprecedented efficacy.
While this technology is still in its early stages, scientists hope it might one day help people with neurological conditions that affect speech to clearly communicate with the outside world.
However, the team that made the decoder also warned that brain-reading platforms could eventually have nefarious applications, including as a means of surveillance for governments and employers. Though the researchers emphasized that their decoder requires the cooperation of human subjects to work, they argued that “brain–computer interfaces should respect mental privacy,” according to a study published on Monday in Nature Neuroscience.
“Currently, language-decoding is done using implanted devices that require neurosurgery, and our study is the first to decode continuous language, meaning more than full words or sentences, from non-invasive brain recordings, which we collect using functional MRI,” said Jerry Tang, a graduate student in computer science at the University of Texas at Austin who led the study, in a press briefing held last Thursday.
The novel approach allowed the team to push the limits of mind-reading technologies by testing whether the decoder could translate the thoughts of participants as they watched silent movies or simply imagined stories in their heads. In both cases, the decoder was able to decipher what participants were seeing during the movies and what they were thinking as they played out brief stories in their imaginations.
The decoder produced more accurate results during the tests with audio recordings than with imagined speech, but it was still able to glean some basic details of unspoken thoughts from the brain activity. For instance, when a subject envisioned the sentence “went on a dirt road through a field of wheat and over a stream and by some log buildings,” the decoder produced text that said “he had to walk across a bridge to the other side and a very large building in the distance.”
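The article does not spell out the algorithm, but the study’s broad idea — a language model proposes candidate word sequences, and the decoder keeps whichever candidate best explains the measured brain activity — can be sketched in a toy form. Everything below is hypothetical illustration: the hash-based featurizer, the random linear “encoding model,” and the simulated fMRI response stand in for the per-subject models the researchers actually trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "encoding model": maps a word-sequence feature vector to a
# predicted brain-response vector. In the real study this is a
# regression model fit per subject on many hours of fMRI data; here
# it is a fixed random linear map purely for illustration.
FEATURE_DIM, VOXEL_DIM = 8, 16
W = rng.normal(size=(FEATURE_DIM, VOXEL_DIM))

def featurize(words):
    """Hypothetical featurizer: bucket each word deterministically."""
    vec = np.zeros(FEATURE_DIM)
    for w in words:
        vec[sum(ord(c) for c in w) % FEATURE_DIM] += 1.0
    return vec / max(len(words), 1)

def predict_response(words):
    """Predicted brain response for a candidate word sequence."""
    return featurize(words) @ W

def score(candidate, observed):
    """Correlation between predicted and observed brain response."""
    return float(np.corrcoef(predict_response(candidate), observed)[0, 1])

# Simulate a noisy "observed" fMRI response for a true word sequence,
# then pick whichever candidate sequence explains it best.
true_seq = ["walked", "across", "the", "bridge"]
observed = predict_response(true_seq) + rng.normal(scale=0.01, size=VOXEL_DIM)

candidates = [
    ["walked", "across", "the", "bridge"],
    ["drove", "down", "a", "highway"],
    ["sat", "in", "a", "kitchen"],
]
best = max(candidates, key=lambda c: score(c, observed))
print(" ".join(best))
```

In the actual system the candidates come from a generative language model and are refined continuously as new fMRI data arrive, which is also why the decoder recovers the gist of a thought rather than its exact wording.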