Brain-computer interface guides speech-disabled person’s intended words to computer screen

Using a brain-computer interface, a clinical trial participant who lost the ability to speak was able to create text on a computer at rates that approach the speed of regular speech just by thinking of saying the words.

PROVIDENCE, R.I. [Brown University] — Scientists with the BrainGate research collaborative have reached a major milestone in restoring speech for people who have lost the ability to speak due to paralysis.

In a new study published in Nature, the researchers describe using sensors implanted in areas of the cerebral cortex associated with speech to accurately turn the brain activity of a patient with ALS who lost the ability to speak into words on a screen just by the patient thinking of saying them.

The clinical trial participant — who can no longer use the muscles of her lips, tongue, larynx and jaw to enunciate units of sound clearly — was able to generate 62 words per minute on a computer screen simply by attempting to speak. This is more than three times as fast as the previous record for assisted communication using implanted brain-computer interfaces (BCIs) and begins to approach the roughly 160-word-per-minute rate of natural conversation among English speakers.

The study, published on Wednesday, Aug. 23, shows that it’s possible to use neural activity to decode attempted speaking movements with better speed and a larger vocabulary than what was previously possible.

“This is a scientific proof of concept, not an actual device people can use in everyday life,” said Frank Willett, one of the study’s lead authors and a research scientist at Stanford University and the Howard Hughes Medical Institute. “It’s a big advance toward restoring rapid communication to people with paralysis who can’t speak.”

The work is part of the BrainGate clinical trial, directed by Dr. Leigh Hochberg, a critical care neurologist and a professor at Brown University’s School of Engineering who is affiliated with the University’s Carney Institute for Brain Science. Dr. Jaimie Henderson, a professor of neurosurgery at Stanford, and Krishna Shenoy, a Stanford professor and HHMI investigator, who died before the study was published, were also authors on the study.

The study is the latest in a series of advances in brain-computer interfaces by the BrainGate consortium, which for several years has been developing systems that enable people to generate text through direct brain control. Previous incarnations involved trial participants thinking about the motions of pointing to and clicking letters on a virtual keyboard and, in 2021, converting a paralyzed person's imagined handwriting into text on a screen at a speed of 18 words per minute.

“With credit and thanks to the extraordinary people with tetraplegia who enroll in the BrainGate clinical trials and other BCI research, we continue to see the incredible potential of implanted brain-computer interfaces to restore communication and mobility,” said Hochberg, who in addition to his roles at Brown is a neurologist at Massachusetts General Hospital and director of the V.A. Rehabilitation Research and Development Center for Neurorestoration and Neurotechnology in Providence.

One of those extraordinary people is Pat Bennett, who, having learned about the 2021 work, volunteered for the BrainGate clinical trial that year.

Bennett, now 68, is a former human resources director and daily jogger who was diagnosed with ALS (amyotrophic lateral sclerosis) in 2012. For Bennett, the progressive neurodegenerative disease stole her ability to speak intelligibly. While Bennett’s brain can still formulate directions for generating units of sound called phonemes, her muscles can’t carry out the commands.

As part of the clinical trial, Henderson, a neurosurgeon, placed two pairs of tiny electrode arrays, each about the size of a baby aspirin, in two separate speech-related regions of Bennett's cerebral cortex. An artificial-intelligence algorithm receives and decodes the electrical activity emanating from Bennett's brain, gradually teaching itself to distinguish the distinct brain activity associated with her attempts to formulate each of the phonemes — such as the sh sound — that are the building blocks of spoken English.

The decoder then feeds its best guess concerning the sequence of Bennett’s attempted phonemes into a language model, which acts essentially as a sophisticated autocorrect system. This system then converts the streams of phonemes into the sequence of words they represent, which are then displayed on the computer screen.
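The two-stage idea described above — a phoneme decoder followed by a language model acting as autocorrect — can be illustrated with a toy sketch. Everything here is illustrative: the mini-lexicon, the edit-distance fallback and the function names are assumptions for demonstration, not the study's actual decoder (the real system uses a neural-network decoder and a large-vocabulary language model).

```python
# Hypothetical mini-lexicon mapping words to ARPABET-style phoneme strings.
# Assumption: a real system uses a large pronunciation dictionary plus a
# statistical language model; this toy version simply picks the lexicon
# entry whose pronunciation is closest to the decoded phonemes.
LEXICON = {
    "ship": "SH IH P",
    "sheep": "SH IY P",
    "chip": "CH IH P",
    "hello": "HH AH L OW",
}

def edit_distance(a, b):
    """Levenshtein distance between two token sequences (rolling-array DP)."""
    dp = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, y in enumerate(b, 1):
            # prev holds the previous row's value at j-1; dp[j] still holds
            # the previous row's value at j when the min() is evaluated.
            prev, dp[j] = dp[j], min(dp[j] + 1, dp[j - 1] + 1, prev + (x != y))
    return dp[-1]

def autocorrect(decoded_phonemes):
    """Return the lexicon word whose pronunciation best matches a
    (possibly noisy) decoded phoneme sequence."""
    tokens = decoded_phonemes.split()
    return min(LEXICON, key=lambda w: edit_distance(tokens, LEXICON[w].split()))

print(autocorrect("SH IH P"))  # exact lexicon match
print(autocorrect("SH IH B"))  # noisy final phoneme still resolves to a word
```

The fallback-to-nearest-pronunciation step is what gives the pipeline its "sophisticated autocorrect" character: a misdecoded phoneme need not ruin the word, because the closest valid pronunciation usually still wins.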

To teach the algorithm to recognize which brain-activity patterns were associated with which phonemes, Bennett engaged in about 25 training sessions, each lasting about four hours, where she attempted to repeat sentences chosen randomly from a large data set.

As part of these sessions, the research team also analyzed the system’s accuracy. They found that when the sentences and the word-assembling language model were restricted to a 50-word vocabulary, the translation system’s error rate was 9.1%. When vocabulary was expanded to 125,000 words, large enough to compose almost anything someone would want to say, the error rate rose to 23.8%.
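The 9.1% and 23.8% figures are word error rates, the standard accuracy metric in speech decoding. As a minimal sketch of how such a rate is typically computed — assuming the conventional Levenshtein-based definition; the helper below is hypothetical, not the study's code:

```python
def word_error_rate(reference, hypothesis):
    """Word error rate: minimum number of word substitutions, deletions
    and insertions needed to turn the hypothesis into the reference,
    divided by the number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Full Levenshtein DP table over words.
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = ref[i - 1] != hyp[j - 1]  # substitution cost 0 or 1
            dp[i][j] = min(dp[i - 1][j] + 1,      # deletion
                           dp[i][j - 1] + 1,      # insertion
                           dp[i - 1][j - 1] + cost)
    return dp[-1][-1] / len(ref)

# One wrong word out of four reference words -> 25% word error rate.
print(word_error_rate("the quick brown fox", "the quack brown fox"))  # 0.25
```

By this definition, a 23.8% error rate on a 125,000-word vocabulary means roughly one word in four needs correction — imperfect, but a large jump over earlier BCI speech decoders.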

The researchers say the figures are far from perfect but represent a giant step forward from prior results using BCIs. They are hopeful about what the system could one day achieve — as is Bennett.

“For those who are nonverbal, this means they can stay connected to the bigger world, perhaps continue to work, maintain friends and family relationships,” Bennett wrote via email. “Imagine how different conducting everyday activities like shopping, attending appointments, ordering food, going into a bank, talking on a phone, expressing love or appreciation — even arguing — will be when nonverbal people can communicate their thoughts in real time.”

The research was funded by the National Institutes of Health (U01-DC017844 and U01-DC019430), the U.S. Department of Veterans Affairs, Howard Hughes Medical Institute, the Simons Foundation and L. and P. Garlick.

This story was adapted from a news release published by Bruce Goldman at Stanford Medicine.

CAUTION: Investigational device. Limited by federal law to investigational use.