
Brain Device that Reads Inner Thoughts Aloud Inspires Strategies to Protect Mental Privacy


By Annika Inampudi

Researchers isolated signals from a brain implant so people with movement disorders could voice thoughts without trying to speak

From counting the number of people in a room to reading this sentence in your head, we are constantly harnessing our ability for inner speech. But what happens when scientists make that internal monologue external? In a paper published today in Cell, researchers describe using brain implants and new data analysis techniques to isolate the neural signals associated with the inner speech of four people with movement disorders that limit their speech, creating a brain-computer interface (BCI) that can vocalize these inner thoughts. The team also explored ways to prevent the system from blurting out internal thoughts a person might want to keep to themselves.

The paper is “impactful” in elucidating how inner speech is produced, says Vikash Gilja, a computer scientist at the University of California (UC) San Diego who has worked on speech-decoding BCIs but was not involved with this work. “This is a paper my whole team is going to pay attention to.”

Over the past decade, engineers have created increasingly sophisticated computer systems that generate words from the brain activity of people paralyzed by strokes or neurological conditions such as amyotrophic lateral sclerosis (ALS). A few months ago, researchers at UC Davis unveiled a device that could decode speech with startling accuracy and speed from electrodes implanted in the motor cortex of a man unable to speak intelligibly because of ALS.

To power this machine, the man had to try as hard as possible to form words with his mouth—to induce signals of “attempted speech” that scientists could pick up from the electrodes. In similar experiments performed at Stanford University, electrical engineer Erin Kunz heard from participants that trying to vocalize their thoughts was tiring and uncomfortable.

People find generating inner speech less stressful, but researchers had previously decoded only a small repertoire of words from those brain signals. Kunz and her team sought to expand on that work. First, the researchers analyzed recordings from electrodes that had been implanted in the brains of four people with movement disorders, in a part of the motor cortex known to be involved in attempted speech. The researchers looked for overlap in the patterns of brain activity in two situations: when participants tried to read words aloud and when they simply thought the words to themselves. The team found that neurons in this region were active during both attempted and inner speech, although inner speech produced lower-amplitude signals.
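In outline, that kind of overlap can be tested with cross-condition decoding: train a word classifier on attempted-speech activity and check whether it transfers to inner speech. The sketch below illustrates the logic on simulated firing rates; all shapes, scales, and the amplitude factor are invented for illustration, not taken from the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy simulation: trials x electrodes firing-rate features for 5 imagined
# words. The 96 electrodes, 200 trials, and 0.4 amplitude factor are
# illustrative assumptions, not values from the study.
rng = np.random.default_rng(0)
n_trials, n_electrodes, n_words = 200, 96, 5
labels = rng.integers(0, n_words, n_trials)
word_patterns = rng.normal(size=(n_words, n_electrodes))

# Same word-specific patterns in both conditions, but inner speech is weaker.
attempted = word_patterns[labels] + rng.normal(size=(n_trials, n_electrodes))
inner = 0.4 * word_patterns[labels] + rng.normal(size=(n_trials, n_electrodes))

# Cross-condition test: a decoder fit on attempted speech transfers to
# inner speech only if the two conditions share a neural code.
clf = LogisticRegression(max_iter=1000).fit(attempted, labels)
print("trained on attempted, tested on inner:", clf.score(inner, labels))

# Per-word mean pattern strength: inner speech comes out lower amplitude.
for name, X in [("attempted", attempted), ("inner", inner)]:
    means = np.stack([X[labels == w].mean(axis=0) for w in range(n_words)])
    print(name, "mean pattern norm:", np.linalg.norm(means, axis=1).mean().round(2))
```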

Focusing on the signals associated with inner speech, Kunz and her colleagues developed an algorithm that could translate them into audio. They trained their model by asking participants to “internally say” words from a 125,000-word vocabulary while recording neural activity. Then the participants were told to imagine saying sentences using those words. The model could decode the sentences from inner speech in real time, with an error rate of 26% to 54%, the most accurate decoding of inner speech reported to date.
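For context on the numbers: error rates like these are conventionally computed as word error rate, the word-level edit distance between the decoded sentence and the intended one. A minimal, self-contained sketch of that metric follows; the example sentences are invented, and only the 26% to 54% range comes from the paper.

```python
# Minimal word-error-rate (WER) sketch: speech BCIs are typically scored by
# the edit distance between decoded and intended word sequences.
def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words, via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # i deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)

# One deletion ("a") and one insertion ("please") against 7 reference words:
print(wer("i would like a glass of water",
          "i would like glass of water please"))  # 2/7, about 0.29
```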

The study is a valuable exploration of the neural mechanisms behind inner speech, although the high error rate and the small number of participants mean the technology will need further refinement, says Stephanie Riès, a neuroscientist at San Diego State University who studies language production.

Scientists are far from being able to read your mind without permission, Kunz says. “We’re focused on helping people … who have issues with speaking,” she says, noting the current work is only able to decode inner speech with a constrained vocabulary. But the study also aimed to address the “ethical frontier” of speech-decoding technologies and create safeguards for people with movement disorders who turn to these devices.

To that end, Kunz and her colleagues tested strategies to prevent speech BCIs from accidentally blurting out inner speech. One approach was to prevent all neural data associated with inner speech from being transmitted to the algorithm, allowing only attempted speech to be decoded. Another was to set a special keyword a user could think—in this case, “chitty chitty bang bang”—which cued the device to start decoding inner speech. For one participant, the system was nearly 99% accurate in determining when they wanted to enter this inner-speech mode.
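Conceptually, the keyword safeguard is a gate on the decoder's output: anything decoded from inner speech is discarded until the trigger phrase is recognized. A minimal sketch of that logic, assuming a hypothetical stream of already-decoded phrases rather than the study's actual pipeline:

```python
# Sketch of the keyword gate: the decoder may run continuously, but its
# output is dropped until the user thinks a trigger phrase. The stream of
# decoded phrases below is a stand-in; only the keyword itself
# ("chitty chitty bang bang") comes from the study.
KEYWORD = "chitty chitty bang bang"

def gated_output(decoded_phrases, keyword=KEYWORD):
    """Yield decoded inner speech only after the keyword has been detected."""
    unlocked = False
    for phrase in decoded_phrases:
        if not unlocked:
            # Private by default: nothing is emitted before the keyword,
            # and the keyword itself is swallowed rather than spoken.
            unlocked = phrase.strip().lower() == keyword
            continue
        yield phrase

stream = ["what time is it", "chitty chitty bang bang", "i would like some water"]
print(list(gated_output(stream)))  # ['i would like some water']
```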

Scientists have long speculated about the problem of mental privacy with speech BCIs, but this paper marks the first time someone has demonstrated a technique for protecting that privacy, Gilja says. “Ultimately, our goal is to restore communication, but only the communication that a person actually intends.”

It’s important to develop safeguards as we enter an age of “brain transparency,” where people’s private thoughts are increasingly decodable by researchers, adds Nita Farahany, a legal scholar who has written about the ethics of emerging neural technologies. “I found it incredibly gratifying to see researchers taking that seriously,” she says. “It’s a great step forward.”

Using inner speech to communicate may not be for everyone, Kunz says, but this tool offers a new option for people pioneering the use and testing of speech BCIs. “Similar to picking out a car, everyone has different features that they want,” she says. “I think that’s going to be the case for BCIs as well.”

 

Original source: https://www.science.org/content/article/brain-device-reads-inner-thoughts-aloud-inspires-strategies-protect-mental-privacy
