

Computer implant translates paralyzed man's brain activity directly into words and sentences

Brain waves that normally control the patient's vocal tract have been converted into entire sentences on a computer screen.

Tibi Puiu
July 16, 2021 @ 1:38 pm


Neurosurgeon Edward Chang, who helped develop a groundbreaking brain-computer system that allowed a paralyzed man to express his thoughts at 15 words a minute. Credit: Barbara Ries/UCSF.

A brainstem stroke following a horrible car crash left a 20-year-old man paralyzed, robbing him of speech. Eighteen years after his dreadful accident, the man is now able to communicate with the outside world thanks to a medical implant that converts brain waves into sentences on a computer. Although this is just a proof of concept, the research is extremely promising, suggesting it may one day be possible to restore sophisticated communication abilities to people who became speech-impaired because of an injury.

“Most of us take for granted how easily we communicate through speech,” Dr. Edward Chang, a neurosurgeon at the University of California, San Francisco, told the Associated Press. “It’s exciting to think we’re at the very beginning of a new chapter, a new field.”

People who are paralyzed and have a speech disability have very limited options for communication. The patient in this new research, for instance, communicated by using a pointer attached to a baseball cap to peck out words or letters on a touchscreen. Other patients who cannot move even their necks rely on devices that track eye movements and translate them into cursor movements to select words or letters on a computer screen.

While these options give paralyzed patients a semblance of connection with the outside world, they are painfully slow. This is where brain-computer interfaces come in: their ability to translate neural activity directly into action has been impressive, to say the least.

These include implants that turn the neural activity of a patient imagining writing a sentence by hand into the actual sentence on a computer screen. Brain-computer interfaces can also be used by paralyzed patients to control mechanical arms, exoskeletons, and even drones, and they can facilitate a telepathic-like exchange of information between two people.

Rather than building a mind-controlled prosthetic, Chang and colleagues’ work centers on a neuroprosthetic for speech. The device converts the brain waves that would normally drive the subtle movements of the lips, jaw, tongue, and larynx during speech into words or entire sentences on a computer screen.

After electrodes were implanted on the surface of the brain region responsible for controlling speech, a computer algorithm was trained on the neural patterns recorded as the man attempted to say common words such as “water” or “good”. The training took place over the course of 50 sessions spread across almost two years.

The algorithm was thus taught to associate specific brain wave patterns with 50 words, which could be combined to form over 1,000 sentences. Chang’s lab had previously spent years mapping the brain areas responsible for speech, so the team had plenty of experience to draw on.
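For readers curious about the general idea, here is a minimal, purely illustrative sketch of how a decoder can be trained to map neural-activity features onto a small word vocabulary. This is not the team’s actual model: their system used high-density electrode recordings, neural-network decoders, and a language model to string words into sentences, and everything below (channel counts, trial counts, the synthetic data) is a hypothetical stand-in.

```python
# Illustrative sketch only: train a classifier that maps neural-activity
# feature vectors to words from a small vocabulary. All data is synthetic;
# this is NOT the published decoding pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

VOCAB = ["water", "good", "thirsty", "hello", "family"]  # stand-in for the 50-word set
N_CHANNELS = 128          # hypothetical number of electrode channels
TRIALS_PER_WORD = 40      # hypothetical number of attempted-speech trials per word

# Fabricate feature vectors: each word gets its own synthetic neural "signature" plus noise.
X, y = [], []
for label, word in enumerate(VOCAB):
    signature = rng.normal(size=N_CHANNELS)
    for _ in range(TRIALS_PER_WORD):
        X.append(signature + rng.normal(scale=0.8, size=N_CHANNELS))
        y.append(label)
X, y = np.array(X), np.array(y)

# Train a simple classifier and check how often it recovers the attempted word.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out word accuracy: {clf.score(X_test, y_test):.2f}")

# Decode a new (synthetic) trial into text.
predicted = clf.predict(X_test[:1])[0]
print("decoded word:", VOCAB[predicted])
```

In the real system, sentence-level output also depends on a language model that weighs which word sequences are likely, rather than classifying each word in isolation.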

For instance, when prompted with questions like ‘How are you today?’ or ‘Are you thirsty?’, the man answered ‘Am very good’ or ‘No, I am not thirsty’ using the text-based communication enabled by the device that read his brain activity.

It takes three to four seconds for the words imagined by the patient to appear on the computer screen. That’s not nearly as fast as speaking but still much faster than tapping out a response, the researchers explained in a paper published in the New England Journal of Medicine.

The prototype could be refined and turned into a device that helps people with injuries, strokes, or illnesses such as Lou Gehrig’s disease that interfere with the delivery of messages from the brain to the vocal tract.

The researchers plan on improving the speed, accuracy, and vocabulary size of their algorithm. The goal is a device that generates voice rather than text on a screen.

