Brain-computer interfaces (BCIs) hold immense potential for individuals with a wide range of neurological conditions, but the road to implementation is long and nuanced for both the invasive and noninvasive versions of the technology. Bin He of Carnegie Mellon University is driven to improve noninvasive BCIs, and his lab uses an innovative electroencephalogram (EEG) wearable to push the boundaries of what's possible. For the first time on record, in a study with 25 human subjects, the group integrated a novel focused ultrasound stimulation to realize a bidirectional BCI that both encodes and decodes brain waves using machine learning. This work opens a new avenue to significantly enhance not only signal quality but also overall noninvasive BCI performance by stimulating targeted neural circuits.
Noninvasive BCI is lauded for being inexpensive, safe, and applicable to virtually everyone, but because signals are recorded on the scalp rather than inside the brain, low signal quality imposes limitations. The He group is exploring ways to improve the effectiveness of noninvasive BCIs and, over time, has used deep learning approaches to decode what an individual is thinking and then facilitate control of a cursor or robotic arm.
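For readers curious what that kind of deep learning decoding can look like in principle, the sketch below shows a toy convolutional classifier that maps a windowed EEG epoch to one of a few cursor directions. It is a minimal illustration, not the He lab's published model: the architecture, the `EEGDecoder` name, the channel count, window length, and class labels are all assumptions made for this example.

```python
# Minimal sketch of an EEG intent decoder for cursor control.
# All shapes and hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn

N_CHANNELS = 64   # assumed number of EEG electrodes
N_SAMPLES = 250   # assumed window length (e.g., 1 s at 250 Hz)
N_CLASSES = 4     # assumed cursor directions: left, right, up, down


class EEGDecoder(nn.Module):
    """Tiny 1D CNN that classifies one EEG window into a movement intention."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution across each channel's time series
            nn.Conv1d(N_CHANNELS, 32, kernel_size=25, padding=12),
            nn.BatchNorm1d(32),
            nn.ELU(),
            nn.AvgPool1d(4),
            nn.Conv1d(32, 64, kernel_size=11, padding=5),
            nn.BatchNorm1d(64),
            nn.ELU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one value per filter
        )
        self.classifier = nn.Linear(64, N_CLASSES)

    def forward(self, x):  # x: (batch, channels, samples)
        return self.classifier(self.features(x).squeeze(-1))


if __name__ == "__main__":
    decoder = EEGDecoder()
    epoch = torch.randn(1, N_CHANNELS, N_SAMPLES)   # one synthetic EEG window
    direction = decoder(epoch).argmax(dim=1)        # index of the predicted direction
    print(direction.item())
```

In a real system, a model like this would be trained on labeled EEG windows and its predictions streamed to the cursor or robotic-arm controller; here the input is random data, so the output is only a demonstration of the data flow.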
In their latest research, published in Nature Communications, the He group demonstrated that precision noninvasive neuromodulation using focused ultrasound could improve the performance of a BCI used for communication.
"This paper reports a breakthrough in noninvasive BCIs by integrating a novel focused ultrasound stimulation to realize bidirectional BCI functionality," explained Bin He, professor of biomedical engineering at Carnegie Mellon University. "Using a communication prosthetic, 25 human subjects spelled out phrases like "Carnegie Mellon" using a BCI speller. Our findings showed that the addition of focused ultrasound neuromodulation significantly boosted the performance of EEG-based BCI. It also elevated theta neural oscillation that enhanced attention and led to enhanced BCI performance."
For context, a BCI speller is a 6x6 visual aid containing the entire alphabet that is commonly used by nonspeakers to communicate. In He's study, subjects donned an EEG cap and, just by looking at the letters, were able to generate EEG signals to spell the desired words. When a focused ultrasound beam was applied externally to the V5 area of the brain (part of the visual cortex), the performance of the noninvasive BCI greatly improved among subjects. The neuromodulation-integrated BCI actively altered the engagement of neural circuits to maximize performance, in contrast to previous approaches, which relied purely on processing and decoding recorded signals.
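To make the speller idea concrete, here is a hedged sketch of the selection step: the decoder scores how well the recorded EEG matches a stimulus template for each cell of the 6x6 grid and emits the best-matching character. The template-matching approach, the `select_character` helper, the 36-character grid contents (letters plus digits), and the channel and sample counts are assumptions for illustration, not the study's actual decoding pipeline.

```python
# Illustrative selection logic for a 6x6 BCI speller (not the published method).
import numpy as np
import string

# 36-cell grid laid out row by row: 26 letters followed by digits 0-9 (assumed contents)
GRID = np.array(list(string.ascii_uppercase + string.digits)).reshape(6, 6)


def select_character(eeg_epoch, templates):
    """Pick the grid character whose template best correlates with the EEG epoch.

    eeg_epoch: array of shape (channels, samples)
    templates: array of shape (36, channels, samples), one template per grid cell
    """
    scores = [
        np.corrcoef(eeg_epoch.ravel(), template.ravel())[0, 1]
        for template in templates
    ]
    best = int(np.argmax(scores))
    return GRID[best // 6, best % 6]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    templates = rng.standard_normal((36, 8, 250))                  # synthetic templates
    epoch = templates[12] + 0.1 * rng.standard_normal((8, 250))    # noisy copy of cell 12
    print(select_character(epoch, templates))                      # prints 'M' (13th cell)
```

The point of the sketch is only the mapping from a per-cell score to a spelled character; in the study, the ultrasound stimulation to V5 improved the quality of the underlying EEG signals that such a selection step depends on.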
"The BRAIN Initiative has supported more than 60 ultrasound projects since its inception. This unique application of noninvasive recording and modulation technologies expands the toolkit, with a potentially scalable impact on assisting people living with communication disabilities," said Dr. Grace Hwang, program director at the Brain Research Through Advancing Innovative Neurotechnologies® initiative (The BRAIN Initiative®) at the National Institutes of Health (NIH).
Following this discovery, the He lab is further investigating the merits and applications of focused ultrasound neuromodulation to the brain, beyond the visual system, to enhance noninvasive BCIs. The researchers also aim to develop a more compact focused-ultrasound neuromodulation device for better integration with EEG-based BCIs, and to integrate AI to further enhance overall system performance.
"This is my lifelong interest, and I will never give up," emphasized He. "Working to improve noninvasive technology is difficult, but I strongly believe that if we can find a way to make it work, everyone will benefit. I will keep working, and someday, noninvasive lifesaving technology will be available for every household."