
A few months ago, engineers from the University of California, Davis, introduced a device that let a patient with ALS (amyotrophic lateral sclerosis), who could no longer speak clearly, "talk" through brain implants. Electrodes were placed in the motor cortex, the part of the brain that controls speech-related movements. For the system to work, the patient had to actively attempt to articulate each word, which was extremely tiring.
Scientists at Stanford University took a different approach. Using the same type of brain implants, they first asked four volunteers to read words aloud and then to imagine saying those words silently. Recording neural activity through the implanted electrodes, they found that even imagined speech activates the motor cortex, though at a lower intensity.
Using artificial intelligence, they trained the system to recognize phonemes, the smallest units of speech, by comparing neural activity during spoken and imagined speech. The AI then reconstructed words and sentences by matching decoded phonemes against a 125,000-word vocabulary.
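To give an intuition for the dictionary-matching step, here is a toy sketch (not the Stanford team's actual pipeline): a noisy decoded phoneme sequence is matched to the closest pronunciation in a small lexicon. The phoneme labels and dictionary entries are illustrative assumptions.

```python
# Toy sketch: map a (possibly noisy) decoded phoneme sequence to the
# nearest word in a pronunciation dictionary. Illustrative only; the
# lexicon and phoneme labels here are made up for the example.

# Hypothetical pronunciation dictionary: word -> phoneme sequence
LEXICON = {
    "hello": ["HH", "AH", "L", "OW"],
    "world": ["W", "ER", "L", "D"],
    "help":  ["HH", "EH", "L", "P"],
}

def edit_distance(a, b):
    """Levenshtein distance between two phoneme sequences."""
    dp = list(range(len(b) + 1))
    for i, pa in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, pb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1,
                                     prev + (pa != pb))
    return dp[-1]

def decode_word(phonemes):
    """Pick the dictionary word whose pronunciation is closest
    to the decoded phoneme sequence."""
    return min(LEXICON, key=lambda w: edit_distance(phonemes, LEXICON[w]))

# One phoneme was misclassified (UW instead of OW), but the
# nearest-match lookup still recovers the intended word.
print(decode_word(["HH", "AH", "L", "UW"]))  # prints "hello"
```

In the real system a probabilistic language model over the full vocabulary would replace this brute-force nearest-match lookup, but the principle is the same: a large word list constrains noisy phoneme decodes to plausible words.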
Participants were asked to mentally say full sentences drawn from this prepared vocabulary. The system decoded this "inner speech" in real time with accuracy ranging from 26% to 54%, the best result reported for imagined-speech decoding so far.
The study, published in Cell, is a significant step forward in understanding the neural mechanisms behind internal speech. However, further improvements are still needed before this technology can be widely used in medicine.