U.S. researchers have moved a step closer to being able to interpret or “read” the thoughts of patients who are no longer capable of speaking.
The promising work could lead to the development of electronic devices to help these patients communicate with family, friends and caregivers.
“This is huge for patients who have damage to their speech mechanisms because of a stroke or Lou Gehrig’s disease and can’t speak,” said the study’s senior author, Robert Knight of the University of California, Berkeley.
In a paper published this week in the journal PLoS Biology, the researchers reported they have been able to “decode the electrical activity” in the superior temporal gyrus, a part of the brain’s auditory system.
When a person listens to spoken language, the brain breaks down the sounds into various frequencies and translates the information into specific electrical signals, explained the study’s lead author, Brian Pasley, a postdoctoral researcher at the university. In a similar fashion, the act of speaking also produces a unique pattern of electrical activity.
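The frequency breakdown Dr. Pasley describes is loosely analogous to computing a spectrogram of the sound, as in this toy sketch (all parameters here are arbitrary illustrative choices, not values from the study):

```python
import numpy as np

# Illustrative only: the auditory system's frequency decomposition is
# loosely analogous to a short-time Fourier transform (spectrogram).
fs = 16000                            # sample rate in Hz (arbitrary)
t = np.arange(0, 1.0, 1 / fs)         # one second of audio
tone = np.sin(2 * np.pi * 440 * t)    # a 440 Hz test tone

win, hop = 512, 256                   # window and hop sizes (arbitrary)
frames = [tone[i:i + win] * np.hanning(win)
          for i in range(0, len(tone) - win, hop)]
spectrogram = np.abs(np.fft.rfft(frames, axis=1))  # energy per frequency band

# The strongest frequency band should sit near the 440 Hz tone.
freqs = np.fft.rfftfreq(win, 1 / fs)
peak = freqs[spectrogram.mean(axis=0).argmax()]
print(f"peak band: {peak:.1f} Hz")    # close to 440 Hz
```

Each row of the resulting spectrogram is one moment in time; each column is the sound energy in one frequency band, the kind of representation the brain's responses can be compared against.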
By gaining a greater understanding of these processes – and learning how to decipher these signals – the researchers believe it may be possible to know what a person is hearing or intends to say simply by monitoring brain activity.
For their study, the researchers enlisted the help of 15 patients who were undergoing exploratory surgery for the treatment of severe epilepsy. The patients had part of their skull removed so that electrodes could be placed directly on the cortex or surface of the brain. They remained in hospital for a week while doctors pinpointed the source of their seizures.
Aside from epilepsy, the patients were essentially healthy and had normal hearing and speech. By agreeing to be part of the study, they provided the researchers with a wealth of information. In fact, each patient had an average of 64 electrodes covering the cortex and one individual had 256 of the sensing devices.
Their brain activity was recorded as they listened to words and segments of conversation. Using this data, Dr. Pasley then created a computer program that could reconstruct and play back the sounds encoded in the electrical activity of the brain. The reproduced sounds were far from perfect, but the work suggests it may be possible to decode these signals in a fairly reliable fashion.
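The idea behind such a decoder can be shown with a toy example on synthetic data: fake “electrode” signals are generated as a linear mixture of a known spectrogram, and a least-squares model learns to invert the mapping. The actual study fit regularized linear models to real cortical recordings; every number below is made up for illustration.

```python
import numpy as np

# Toy decoding sketch: recover a stimulus spectrogram from simulated
# multi-electrode recordings. Dimensions and noise level are invented.
rng = np.random.default_rng(0)
n_time, n_freq, n_elec = 500, 32, 64   # time bins, frequency bands, electrodes

S = rng.random((n_time, n_freq))                 # "true" stimulus spectrogram
W = rng.standard_normal((n_freq, n_elec))        # unknown neural encoding
X = S @ W + 0.1 * rng.standard_normal((n_time, n_elec))  # noisy recordings

# Fit the decoder on the first half of the data, reconstruct the second half.
train, test = slice(0, 250), slice(250, 500)
D, *_ = np.linalg.lstsq(X[train], S[train], rcond=None)
S_hat = X[test] @ D                              # reconstructed spectrogram

corr = np.corrcoef(S_hat.ravel(), S[test].ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")
```

On this deliberately easy synthetic problem the reconstruction is nearly perfect; with real brain recordings, as the article notes, the reproduced sounds were far from perfect.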
The researchers cautioned that they still have a lot more work to do. “We are looking at the very early stages of a long and complicated process,” Dr. Pasley said. Even so, there is reason to believe that if a person can imagine talking – even if he can’t utter a word – that brain activity can be picked up and interpreted.
“The general principle is that the area in the brain that performs the task is also activated when you imagine doing the task,” Dr. Knight said.
Indeed, earlier studies have shown it is possible to move objects – such as a cursor on a computer screen – with electrodes hooked up to the brain. Just thinking about the action makes it happen.
Similarly, if researchers can decipher the signals governing speech, “you could synthesize the actual sound a person is thinking, or just write out the words with a type of interface device,” Dr. Pasley speculated. The procedure, though, would likely require an invasive operation – with electrodes permanently implanted into the brain.