💬 AI Learned to "Read" Thoughts
Scientists at Stanford have, for the first time, used AI to decode inner speech — the words people silently say in their heads.
The experiment's four participants, who had lost the ability to speak after a stroke, had microelectrodes implanted in the motor cortex. The volunteers were asked either to attempt to say certain words aloud or to imagine saying them. The vocabulary used included about 125,000 words.
It turned out that both tasks activated roughly the same brain regions and produced similar patterns of neural activity. An AI model trained on this data was able to decode the volunteers' internal monologues with 74% accuracy, letting them express their thoughts at a comfortable conversational pace of about 120–150 words per minute.
The scientists also devised a way to keep private thoughts from being decoded. Participants were asked to say a "password" to themselves: chitty chitty bang bang. The AI transcribed neural activity only after recognizing the code phrase, leaving all other inner speech private.
"This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech," said Frank Willett, the study's co-author and an assistant professor of neurosurgery at Stanford University.
