Researchers have long recognized that brain-computer interfaces (BCIs) could offer a solution for patients who, paralyzed by injury or disease, cannot communicate through speech or typing. By monitoring brain activity at its source, these neural interfaces translate patterns of the brain’s electrical activity into control signals for assistive communication devices. The goal of these applications is to restore as much communication speed and accuracy as possible while posing the lowest possible risk and burden to the patient.
The typical speed benchmark for these applications is expressed as words per minute (WPM) or characters per minute (CPM); 1 WPM equals 5 CPM. For reference, able-bodied typing speeds are approximately 38-40 WPM (190-200 CPM). Of course, the ideal system would enable effortless communication at the speed of natural speech; English speakers in the United States converse at roughly 150 WPM (750 CPM).
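The conversion between these two units is simple arithmetic, and keeping the benchmarks in one place makes the comparisons later in this article easier to follow. A minimal sketch (the figures are the benchmarks quoted above; the function and dictionary names are illustrative):

```python
def wpm_to_cpm(wpm: float) -> float:
    """Convert words per minute to characters per minute (1 word = 5 characters)."""
    return wpm * 5

# Benchmarks quoted in the text, expressed in WPM.
benchmarks_wpm = {
    "able-bodied typing (low end)": 38,
    "able-bodied typing (high end)": 40,
    "conversational speech (US English)": 150,
}

for label, wpm in benchmarks_wpm.items():
    print(f"{label}: {wpm} WPM = {wpm_to_cpm(wpm):.0f} CPM")
```

Running this reproduces the figures above, e.g. 150 WPM corresponds to 750 CPM.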
Developed in the 1980s, the first assistive communication BCI prototypes measured brain activity via sensors on the scalp (i.e., electroencephalogram or EEG). Researchers reported that users of these “speller” systems could indeed spell out words, but at the frustratingly slow rate of 2.3 CPM. Still, these early experiments functioned as proof of concept for assistive communication BCI.
Over time, non-invasive BCI technology used in research settings has improved incrementally but has generally yielded communication speeds below 10 WPM (50 CPM). Further, using these systems requires extremely focused engagement with the task and intense visual attention.
Similar performance can be achieved without a BCI by using gaze tracking alone. In one study of a novel gaze-tracking approach (without BCI), five participants with amyotrophic lateral sclerosis (ALS) achieved an average typing speed of 9.51 WPM (47.5 CPM). Note that in addition to being rather slow, gaze tracking has other downsides, including eye strain and double vision.
Whereas early systems measured brain activity through sensors on the scalp, the new millennium brought assistive communication interfaces that either sit directly on the brain’s surface (electrocorticography, or ECoG) or are inserted into the cortex itself (intracortical microelectrodes). These applications have focused on three main communication modes: cursor-controlled typing, imagined handwriting, and speech decoding.
In 2006, researchers announced the success of a “neural cursor,” an intracortical microelectrode system that recorded signals from the brain’s motor cortex and interpreted them to guide a computer cursor. A 2015 study used the same type of intracortical device with an improved algorithm for interpreting the neural data; subjects achieved substantially higher typing speeds than previously reported for cursor-controlled typing. A more recent study, again using implanted microelectrodes, demonstrated further gains, with an average typing speed of 28.1 CPM (5.6 WPM) and a maximum of 39.2 CPM (7.84 WPM).
Another cursor-control study took a hybrid approach, combining gaze tracking with the Stentrode BCI from Synchron; rather than being inserted into the cortex, this stent-like device is placed in a blood vessel in the brain. In this study, the two participants achieved 13.81 CPM (2.7 WPM) and 20.1 CPM (4.02 WPM).
Compared to their noninvasive predecessors, these “neural cursor” interfaces offer an impressive degree of control. Though often faster and more accurate than EEG interfaces, these systems remain dramatically slower than natural speech or able-bodied typing. An additional downside is that they occupy the user’s visual attention.
Researchers have also developed an interface that decodes imagined handwriting. A 2021 study utilized two microelectrode arrays (200 electrodes) implanted in the area of the motor cortex that controls the hand. The paralyzed participant imagined writing letters with a pen, and the resulting brain signals were decoded into text with the help of an autocorrect algorithm. This system achieved 90 CPM (18 WPM) with greater than 99% accuracy, the fastest speed of any published study up to that time, far exceeding eye-gaze tracking and other non-invasive BCI applications. These results demonstrate an invasive BCI application that does not require vision and achieves greater communication speed, an advantage over non-invasive BCIs.
In another approach, the user imagines speaking while electrodes record from brain areas that control the muscles involved in speech. An algorithm then decodes this brain data, with the goal of producing a transcript of the imagined speech. This type of speech decoding is especially challenging because natural speech production engages more than 100 muscles across the mouth, tongue, and vocal tract.
A recent study demonstrated decoding of intended speech in a paralyzed individual from the output of an ECoG array at a rate of 15.2 WPM; the system had a 25.6% word error rate when decoding sentences formed from a 50-word vocabulary. An additional ECoG speech-decoding experiment and a study with intracranial electrodes implanted at multiple depths both suggest that neural interfaces that can target individual neurons (i.e., intracortical microelectrodes) could yield greater accuracy in speech decoding.
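The error rates reported in speech-decoding studies are typically word error rates: the word-level edit distance (substitutions, insertions, and deletions) between the decoded transcript and the intended sentence, divided by the length of the intended sentence. A minimal sketch of how such a rate is computed, as an illustration rather than the study’s actual implementation:

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """Word-level edit distance divided by reference length.

    Assumes a non-empty reference sentence.
    """
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over word sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, decoding "I am thirsty" as "I am very thirsty" is one insertion against a three-word reference, a word error rate of about 33%.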
The studies summarized above demonstrate that it is feasible to translate the brain’s neural activity into some form of communication using cursor-controlled typing, imagined handwriting, and speech decoding. However, the practical performance of these devices has been limited in both speed and accuracy.
Furthermore, these applications have been confined to laboratory settings and use electrodes that are approved only for short-term implantation. As such, they remain impractical for the people who need them most. Paradromics is now working on technology capable of overcoming these limitations.
With over 1600 electrodes in direct contact with the cortex, our Connexus Direct Data Interface will have the ability to collect a dramatically richer stream of brain data than previous systems. Researchers have posited that collecting more precise recordings from more individual neurons will translate into communication applications that are both faster and more accurate than their predecessors. Additionally, the Paradromics technology is being designed for long-term implantation. Together, these advancements will yield an assistive communication device that produces results not just in the lab but in daily life, improving the autonomy and quality of life of everyday users.
To learn more about BCI for assistive communication, listen to this episode of the Neurotech Pub podcast: “What We’ve Got Here is Failure to Communicate.”
Angrick M, Ottenhoff MC, Diener L, et al. Real-time synthesis of imagined speech processes from minimally invasive recordings of neural activity. Commun Biol. 2021; 4(1):1055.
Anumanchipalli GK, Chartier J, Chang EF. Speech synthesis from neural decoding of spoken sentences. Nature. 2019; 568: 493-510.
Farwell LA, Donchin E. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr Clin Neurophysiol. 1988;70(6):510-523.
Hochberg LR, Serruya MD, Friehs GM, et al. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature. 2006;442(7099):164-171.
Jarosiewicz B, Sarma AA, Bacher D, et al. Virtual typing by people with tetraplegia using a self-calibrating intracortical brain-computer interface. Sci Transl Med. 2015;7(313):313ra179.
Kawala-Sterniuk A, Browarska N, Al-Bakri A, et al. Summary of over Fifty Years with Brain-Computer Interfaces-A Review. Brain Sci. 2021;11(1):43.
Moses DA, Metzger SL, Liu JR, Anumanchipalli GK, Makin JG, Sun PF, Chartier J, et al. Neuroprosthesis for decoding speech in a paralyzed person with anarthria. N Engl J Med. 2021;385(3):217-227.
Mott ME, Williams S, Wobbrock JO, Morris MR. Improving Dwell-Based Gaze Typing with Dynamic, Cascading Dwell Times. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM. 2017; 2558-2570.
Pandarinath C, Nuyujukian P, Blabe CH, et al. High performance communication by people with paralysis using an intracortical brain-computer interface. Elife. 2017;6.
Willett FR, Avansino DT, Hochberg LR, Henderson JM, Shenoy KV. High-performance brain-to-text communication via handwriting. Nature. 2021;593:249-254.