
Turning Thoughts into Speech

07/01/2019
Image captions:
  • New technology could give paralyzed people the ability to speak just by thinking words.
  • Dr. Gopala Anumanchipalli shows an array of intracranial electrodes like those being used in the study he is involved with. (UCSF)
  • With a series of electrodes, an implant allows a computer to read signals from the brain and translate them into spoken language. (UCSF)
  • An array of electrodes like this can be placed inside the skull to read electrical signals from the brain. (UCSF)
  • Technology is decoding signals picked up from the brain and turning those signals into words. (UCSF)

Every 60 seconds, a healthy human vocal tract can form and pronounce about 150 words. By that count, reading this story aloud would take less than four minutes. But what about those who can’t coordinate the lips, jaw, tongue, and throat to talk? Researchers are working to bypass the usual voice mechanisms—and turn thoughts straight into understandable speech.
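
As a rough check of that arithmetic, here is a tiny Python sketch. The story length below is an assumed round number, not an exact count; the 150-words-per-minute rate is the one given above.

    # Rough reading-time check; the word count is a guessed round number.
    words_in_story = 500        # assumption, not an exact count
    words_per_minute = 150      # rate given in the article
    minutes_to_read = words_in_story / words_per_minute
    print(f"About {minutes_to_read:.1f} minutes to read aloud")  # roughly 3.3, under four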

Physical conditions like stroke or brain trauma can prevent human speech. God knows our thoughts before we speak them. (Psalm 139:2) But in order to communicate with other humans, non-speaking persons must often use photographs, symbols, sign language, or even high-tech computer apps or eye trackers—devices that use eye movements to move a cursor. However, compared to audible speech, these tools can be cumbersome for both speaker and hearer.

Edward Chang is a neurosurgeon at the University of California, San Francisco (UCSF). This spring, Chang and other researchers published a study describing a way to translate thoughts into speech. Their method involves studying human speech backwards: instead of starting with sounds, the team starts with the brain signals that tell the mouth and throat how to move.

First, they found patients whose brains were already being tested. With brain electrodes connected to a powerful computer, test subjects read hundreds of sentences aloud. The computer recorded signals from the parts of their brains that control tongue, lip, and throat muscles.

“Very few of us have any real idea of what’s going on in our mouth when we speak,” Chang says. “The brain translates those thoughts of what you want to say into movements of the vocal tract, and that’s what we want to decode.”

The researchers used the signals from each brain to recreate what the mouth and throat needed to do in order to say different words. A computer then simulated each patient’s muscle movements and “re-spoke” the sentences.
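
To picture that two-step idea, here is a small, made-up Python sketch. It is not the researchers’ system; the study trained neural networks on real recordings, while the random matrices below simply stand in for the two learned steps, brain signals to vocal-tract movements and movements to sound features.

    # Illustrative only: random arrays stand in for recordings and trained models.
    import numpy as np

    rng = np.random.default_rng(0)

    # Pretend recording: 200 time steps of activity from 64 electrodes.
    neural_signals = rng.normal(size=(200, 64))

    # Step 1: map brain activity to vocal-tract movements
    # (12 hypothetical channels for lips, jaw, tongue, and throat).
    signals_to_movements = rng.normal(size=(64, 12)) * 0.1

    # Step 2: map those movements to sound features a synthesizer could voice
    # (32 hypothetical spectral channels).
    movements_to_sound = rng.normal(size=(12, 32)) * 0.1

    vocal_tract_movements = neural_signals @ signals_to_movements   # shape (200, 12)
    sound_features = vocal_tract_movements @ movements_to_sound     # shape (200, 32)

    print(vocal_tract_movements.shape, sound_features.shape)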

Researchers played the computer-spoken sentences for a test group. Listeners wrote down what they heard. Amazingly, nearly half the listeners understood most sentences correctly, especially when they could choose from a prepared list of possible words.

There were some hiccups: “rabbit” was heard as “rodent,” and “mom” stood in for “mum.” But overall, the tests were successful.
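
One simple way to score a listening test like this is to count how many of the intended words a listener writes down correctly. The short Python sketch below is only an illustration; its sentence is invented, not taken from the study.

    # Toy scoring: fraction of intended words the listener got right, position by position.
    def word_accuracy(intended: str, heard: str) -> float:
        intended_words = intended.lower().split()
        heard_words = heard.lower().split()
        matches = sum(a == b for a, b in zip(intended_words, heard_words))
        return matches / len(intended_words)

    print(word_accuracy("the rabbit hid near mom", "the rodent hid near mum"))  # prints 0.6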

So far, the technology has been tested only on people with typical speech. But neurosurgery professor Jaimie Henderson believes that “within the next 10 years, . . . we’ll be seeing systems that will improve people’s ability to communicate.”

Neurologist Leigh Robert Hochberg wishes change would happen faster than that. Still, he’s pleased with the progress. “I think brain-computer interfaces will have a lot of opportunity to help people,” he says, “and hopefully, to help people quickly.”