Wearable Brain-Machine Interface Turns Intentions Into Actions

Updated: Sep 24, 2021

A new wearable brain-machine interface (BMI) could improve the quality of life for people with paralysis or motor disorders, even those with locked-in syndrome, in which a person is fully conscious but unable to move or communicate. The neuroprosthesis lets wearers wirelessly control a wheelchair or robotic arm simply by imagining an action. A multi-institutional, international team of researchers led by the lab of Woon-Hong Yeo at the Georgia Institute of Technology combined wireless soft scalp electronics and virtual reality in a BMI system that allows the user to imagine an action and wirelessly control the device.


The team, which included researchers from the University of Kent (United Kingdom) and Yonsei University (Republic of Korea), describes the new motor imagery-based BMI system this month in the journal Advanced Science. Yeo, an associate professor in the George W. Woodruff School of Mechanical Engineering, said the main advantage of this system over existing ones is that it is soft and comfortable to wear, and has no wires. BMI systems are a rehabilitation technology that analyzes a person’s brain signals and translates that neural activity into commands, turning intentions into actions. The most common non-invasive method for acquiring those signals is electroencephalography (EEG), which typically requires a cumbersome electrode skull cap and a tangled web of wires.
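
In rough terms, a BMI of this kind reads a short window of EEG, decodes the intended action, and maps it to a device command. The sketch below illustrates that loop; the decoder logic and command names are hypothetical placeholders for illustration, not the authors’ implementation.

```python
# Minimal sketch of a BMI command loop: decode one window of EEG and
# translate the result into a device command. All names here are
# illustrative assumptions, not the paper's actual system.
import numpy as np

COMMANDS = {0: "arm_grasp", 1: "wheelchair_forward", 2: "rest"}

def decode_intention(window: np.ndarray) -> int:
    """Stand-in decoder: a trained model would go here. This toy version
    picks a class by comparing average signal power across two channel
    groups of a (channels x samples) window."""
    power = np.var(window, axis=1)
    if power[:2].mean() > 1.5 * power[2:].mean():
        return 0
    if power[2:].mean() > 1.5 * power[:2].mean():
        return 1
    return 2

window = np.random.randn(4, 500)           # 4 channels x 2 s at 250 Hz
print(COMMANDS[decode_intention(window)])  # e.g. "rest"
```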


Electroencephalography (EEG)

These devices generally rely heavily on gels and pastes to maintain skin contact, require lengthy set-up times, and are mostly inconvenient and uncomfortable to use. They also often suffer from poor signal acquisition due to material degradation or motion artifacts, the ancillary “noise” that can be produced by something like grinding teeth or blinking eyes. This noise shows up in the brain data and must be filtered out. The portable EEG system Yeo designed, integrating imperceptible microneedle electrodes with soft wireless circuits, offers improved signal acquisition. Accurately measuring those brain signals is critical to determining what actions a user wants to perform, so the team integrated a powerful machine learning algorithm and a virtual reality component to address that challenge.
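
As a concrete illustration of that filtering step, a common way to suppress out-of-band artifacts such as blinks or muscle activity is a band-pass filter. The sketch below keeps roughly 1–40 Hz and assumes a 250 Hz sampling rate; both numbers are generic assumptions, not details from the paper.

```python
# A minimal sketch of artifact suppression for EEG: zero-phase
# band-pass filtering of a multi-channel recording.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                                   # sampling rate in Hz (assumed)
b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)

raw = np.random.randn(4, int(4 * fs))        # 4 channels x 4 s of synthetic EEG
clean = filtfilt(b, a, raw, axis=1)          # filter each channel independently
print(clean.shape)                           # (4, 1000)
```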


Even though the device has not yet been tested with disabled individuals, the new system was evaluated with four human subjects. “This is just a first demonstration, but we’re thrilled with what we have seen,” noted Yeo, Director of Georgia Tech’s Center for Human-Centric Interfaces and Engineering under the Institute for Electronics and Nanotechnology, and a member of the Petit Institute for Bioengineering and Bioscience. Yeo’s team first introduced a soft, wearable EEG brain-machine interface in a 2019 study published in Nature Machine Intelligence. The lead author of that work, Musa Mahmood, was also the lead author of the team’s new research paper.


Mahmood, a Ph.D. student in Yeo’s lab, said the new BMI uses a completely different paradigm, involving imagined motor actions such as grasping with either hand, which frees the subject from having to watch too many stimuli. In the 2021 study, users demonstrated accurate control of virtual reality exercises using their thoughts, that is, their motor imagery. The visual cues also streamline the process for both the user and the researchers gathering data. “The virtual prompts have proven to be very helpful,” Yeo said. “They speed up and improve user engagement and accuracy. And we were able to record continuous, high-quality motor imagery activity.” According to Mahmood, future work on the system will focus on optimizing electrode placement and more advanced integration of stimulus-based EEG, using what they have learned from the last two studies.
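
To make the motor-imagery idea concrete, the sketch below shows a common baseline for decoding two imagined actions (left- versus right-hand grasping) from band-pass-filtered EEG epochs: log-variance features fed to a linear discriminant classifier. This is a generic textbook approach run on synthetic data, not the decoder used in the paper.

```python
# A minimal sketch of two-class motor-imagery decoding on synthetic epochs.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def log_variance(epochs: np.ndarray) -> np.ndarray:
    """Map epochs of shape (n_trials, n_channels, n_samples)
    to per-channel log-variance features (n_trials, n_channels)."""
    return np.log(np.var(epochs, axis=2))

rng = np.random.default_rng(0)
X = rng.standard_normal((40, 4, 500))  # 40 synthetic trials, 4 channels, 2 s @ 250 Hz
y = np.repeat([0, 1], 20)              # 0 = imagined left hand, 1 = imagined right hand

clf = LinearDiscriminantAnalysis().fit(log_variance(X), y)
print(clf.predict(log_variance(X[:2])))  # predicted imagery class per trial
```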


References
“Wireless Soft Scalp Electronics and Virtual Reality System for Motor Imagery-Based Brain–Machine Interfaces,” Woon-Hong Yeo et al., Advanced Science (2021).
