Sunday, May 10, 2026

Reading the Eloquent Brain Regions

Telling the difference between inner and outer speech.

As we understand more about how the brain works, we can decode what is going on inside. But sometimes, the result is oversharing! A recent paper discussed how one lab is filtering out inner speech from intentional, outer-directed speech in their brain-computer interface (BCI).

After a very long road of brain research, we understand a great deal of its computational mechanism. There is no soul, no mystery other than the amazingly intricate interdigitation of billions of neurons, painstakingly architected over evolution into a rapid learning and response machine. A recent paper described progress on a brain-computer interface, whereby people with brain damage can be helped to communicate via undamaged portions. In addition to the attempted speech the researchers were trying to help their subjects express, they found that unspoken thoughts were also detectable. This led them to provide a switchable interface, where the subject could turn that part on or off at will!

Updated view of speech production areas in the brain. v stands for ventral, PrCG stands for precentral gyrus, the general motor control area. Wernicke's area is known to participate in general language construction / comprehension.

The human facility for speech is unique among animals; while many animals vocalize and communicate with some complexity, none have been found with the richness of human speech. As a recent evolutionary innovation, it has some rough edges, but we have quickly become utterly dependent on speech for all aspects of social life.

In the brain, speech production happens in motor areas, which reside in a vertical stripe of cortex forward of the sensory map. It had been thought that Broca's area was the main region, and that it was a purely motor conduit, constructing from the sensory and higher regions the signals sent to innervate the larynx, tongue, lips, and other vocal apparatus. But it turns out to be more complicated. More of the precentral gyrus (the motor cortex) is involved, and when one records from these regions, one gets many kinds of signals: speech perception, not just production, other aspects of audio perception, and premotor thoughts about communication, such as phonological working memory and silent rehearsal. Indeed, Broca's area appears to have higher-level roles, more in sentence construction than in detailed motor control and articulation.

The current authors took off from these recent insights and placed high-density electrode arrays (about 60 electrodes in each of several locations) up and down the precentral gyrus in patients who had been rendered inarticulate by stroke or ALS. Even though this sampling is unbelievably crude compared to the brain's own density of computation, it was enough to generate recognizable speech.

"We discovered that inner speech is robustly represented and demonstrated a proof-of-concept real-time inner-speech BCI that can decode self-paced imagined sentences from a large vocabulary (125,000 words). We also found that aspects of free-form inner speech could be decoded even during tasks where participants were not explicitly instructed to use inner speech. ... To prevent unintended output during inner-speech BCI use, we also demonstrate a system where an internally spoken “keyword” can be detected with high accuracy, enabling a user to “lock” and “unlock” the system."


Outline of the experiments. Electrode micro-arrays were implanted in several places up and down the motor area, or precentral gyrus (C). The computer, using large language models to make sense of the array signals, interpreted them as the user was prompted (or not prompted) to attempt speech.
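One common way a language model "makes sense" of noisy decoder output is rescoring: the neural decoder proposes candidate word sequences with scores, and the language model re-weights them so that linguistically plausible sentences win. The candidates and scores below are made up for illustration; this is a sketch of the general idea, not the paper's actual pipeline.

```python
# Each candidate sentence gets a (log-probability-like) score from the
# neural decoder and another from a language model. Values are invented.
candidates = {
    "i want water": {"neural": -2.0, "lm": -1.0},
    "eye want water": {"neural": -1.8, "lm": -6.0},  # similar neurally, implausible linguistically
}

def combined(scores, lm_weight=1.0):
    # Sum the two scores; lm_weight trades off decoder vs. language model
    return scores["neural"] + lm_weight * scores["lm"]

best = max(candidates, key=lambda s: combined(candidates[s]))
print(best)  # i want water
```

Even though "eye want water" scores slightly better from the (hypothetical) neural decoder, the language model's penalty for the implausible sentence flips the final choice.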

Given enough computing power, even these patchy signals from the subjects' brains could be processed into something useful. The researchers found that the same algorithms that recover attempted speech also detect silent speech, reading, and even listening. This implies, perhaps, that we listen sympathetically, struggling in our own speech production areas when a speaker struggles, and are prone to following or copying other speakers. Or else it simply reiterates that these areas are not pure motor-control regions, but participate in a more integrated way in the hearing/thinking/speaking circuit. Fortunately, the signals differ significantly in amplitude, so they can be told apart. The accuracy of all this is not great, as shown below, but it is surely better than being locked in.
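The amplitude difference that lets the system tell attempted speech from inner speech can be illustrated with a toy threshold classifier. The numbers here are synthetic (the real system works on high-dimensional firing-rate features, not scalars); the only assumption carried over from the text is that attempted speech evokes larger-amplitude activity than inner speech.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "neural amplitudes": attempted speech is stronger on average.
# Units and means are invented for illustration.
attempted = rng.normal(loc=5.0, scale=1.0, size=200)
inner = rng.normal(loc=2.0, scale=1.0, size=200)

# Simplest possible separator: the midpoint between the two class means
threshold = (attempted.mean() + inner.mean()) / 2

acc_attempted = np.mean(attempted > threshold)   # attempted trials above threshold
acc_inner = np.mean(inner <= threshold)          # inner trials at or below it
print(f"attempted correctly flagged: {acc_attempted:.0%}")
print(f"inner correctly flagged:     {acc_inner:.0%}")
```

Because the two distributions overlap, even this caricature misclassifies some trials, which is consistent with the imperfect accuracy the authors report.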

The results (C) of interpreting inner speech, as opposed to more strongly attempted speech, are far from perfect, but give some indication of what the subject is thinking about.