Key idea: The research is a long way from practical use, but researchers hope it might one day aid communication for people who have experienced brain injuries.
Original author and publication date: Will Sullivan (Smithsonian) – September 14, 2022
Futurizonte Editor’s Note: If AI can read our brain, AI can predict much more than just what we were listening to.
From the article:
Scientists are trying to use artificial intelligence to translate brain activity into language.
An A.I. program analyzed snippets of brain activity from people who were listening to recorded speech. It tried to match these brainwaves to a long list of possible speech segments that the person may have heard, writes Science News’ Jonathan Moens.
For each snippet, the algorithm predicted the ten most likely speech segments, and more than 70 percent of the time its top-ten list contained the correct answer.
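The matching step described in the article can be sketched in a few lines: score a brain-activity embedding against a pool of candidate speech-segment embeddings and keep the ten best matches. This is a toy illustration, not Meta's actual model — the embeddings, similarity measure, and all names below are assumptions for the sake of the example.

```python
# Hypothetical sketch of top-10 candidate matching, as described above.
# The vectors, similarity measure, and segment names are illustrative only.
import math
import random

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(brain_vec, candidates, k=10):
    """Return the k candidate IDs whose embeddings best match brain_vec."""
    scored = sorted(candidates, key=lambda c: cosine(brain_vec, c[1]), reverse=True)
    return [cid for cid, _ in scored[:k]]

random.seed(0)
dim = 16
# A pool of candidate speech segments, each with a random embedding...
pool = [(f"segment_{i}", [random.gauss(0, 1) for _ in range(dim)])
        for i in range(100)]
# ...and a simulated "brain" vector made noisily similar to one true segment.
true_id, true_vec = pool[42]
brain = [x + random.gauss(0, 0.3) for x in true_vec]

ranked = top_k(brain, pool, k=10)
print(true_id in ranked)  # the correct segment lands in the top-ten list
```

In this toy setup the correct segment is recovered because its embedding is far closer to the noisy "brain" vector than any random candidate; the hard part of the real research is learning embeddings where that closeness actually holds for measured brain activity.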
The study, conducted by a team at Facebook’s parent company, Meta, was posted in August to the preprint server arXiv and has not been peer reviewed yet.
In the past, much of the work to decode speech from brain activity has relied on invasive methods that require surgery, writes Jean-Rémi King, a Meta A.I. researcher and a neuroscientist at the École Normale Supérieure in France, in a blog post. In the new research, scientists used brain activity measured with non-invasive technology.
The findings currently have limited practical implications, per New Scientist’s Matthew Sparkes. But the researchers hope to one day help people who can’t communicate by talking, typing or gesturing, such as patients who have suffered severe brain injuries, King writes in the blog post.
Most existing techniques to help these people communicate involve risky brain surgeries, per Science News.