Scientists translate thoughts into words

It sounds like the stuff of science fiction, but scientists at the University of Utah recently brought us one step closer to having a true mind-reading machine.

The researchers hypothesized that the human brain likely creates much the same signals whether a person says a word aloud or simply thinks it. Based on this idea, they are working on a device to translate brain signals into words.

Aside from just being a neat thing to figure out, the practical hope is to develop a new communication tool for individuals who are unable to speak because of severe paralysis, stroke, Lou Gehrig’s disease or other trauma or illness. Many people who are currently ‘locked in’ must rely on small facial movements or eye blinks to signal yes or no, or must painstakingly pick out individual letters, in order to communicate.

For this study, the Utah team worked with a volunteer with severe epilepsy who had had a portion of his skull removed as part of a surgical treatment for his condition. Researchers attached two button-sized grids of electrodes to the surface of the speech centres of his brain. During hour-long sessions over four days, the patient repeatedly read 10 simple words chosen because they might be particularly useful for communicating basic needs: yes, no, hot, cold, hungry, thirsty, hello, goodbye, more and less. A computer recorded the brain signals generated for each word.

Afterward, when the patient said the same words aloud again, the computer tried to match the new brain signals to the recorded pattern for each word.

Amazingly, when the computer compared the signals for any two words, it identified the correct word with between 75 and 90 percent accuracy. Results were less impressive when it had to pick from all 10 words at once, with between 28 and 48 percent accuracy. That is still better than chance (randomly guessing among 10 words would be right only about 10 percent of the time), but it is not yet accurate enough to be of practical use.

Still, according to the lead scientist for this study, this is a proof of concept and an exciting first step. Next steps will involve improving the accuracy and expanding the technology so that a wireless translation device can be made and used by those who need it.

This team believes such a device could be available within two to three years – which could mean a vastly improved quality of life for individuals unable to communicate.

Another interesting finding from this study involved the difference between the two areas of the brain chosen for gathering signals. Researchers focused on the facial motor cortex and a lesser-known area also involved in language, Wernicke’s area. They found the facial motor cortex was much more active when the volunteer repeated words, while Wernicke’s area was very active when the volunteer was thanked after repeating words. This suggests Wernicke’s area is more involved in high-level understanding of language, while the facial motor cortex controls the facial muscles that help produce sounds.

Results from this preliminary study at the University of Utah were published in September in the Journal of Neural Engineering.
