
Wearable Brain-Machine Interface Turns Intentions into Actions

A new wearable brain-machine interface (BMI) system could improve the quality of life for people with motor dysfunction or paralysis, even those struggling with locked-in syndrome – when a person is fully conscious but unable to move or communicate.

A multi-institutional, international team of researchers led by the lab of Woon-Hong Yeo at the Georgia Institute of Technology combined wireless soft scalp electronics and virtual reality in a BMI system that allows the user to imagine an action and wirelessly control a wheelchair or robotic arm.

The team, which included researchers from the University of Kent (United Kingdom) and Yonsei University (Republic of Korea), describes the new motor imagery-based BMI system this month in the journal Advanced Science.

“The major advantage of this system to the user, compared to what currently exists, is that it is soft and comfortable to wear, and doesn’t have any wires,” said Yeo, associate professor in the George W. Woodruff School of Mechanical Engineering.

BMI systems are a rehabilitation technology that analyzes a person’s brain signals and translates that neural activity into commands, turning intentions into actions. The most common non-invasive method for acquiring those signals is electroencephalography (EEG), which typically requires a cumbersome electrode skull cap and a tangled web of wires.

These devices rely heavily on gels and pastes to maintain skin contact, require extensive set-up times, and are generally inconvenient and uncomfortable to use. They also often suffer from poor signal acquisition due to material degradation or motion artifacts – the ancillary “noise” caused by something like teeth grinding or eye blinking. This noise shows up in the brain data and must be filtered out.
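Filtering that noise out of the raw recording is a standard preprocessing step in EEG work. The sketch below is not the paper's actual pipeline – just a minimal, generic illustration that band-passes a synthetic EEG trace to the 8–30 Hz mu/beta band commonly used in motor imagery, suppressing both slow drift (e.g., from electrode movement) and higher-frequency artifacts; all parameter choices here are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eeg(signal, fs, low=8.0, high=30.0, order=4):
    """Band-pass filter an EEG trace to the mu/beta band (8-30 Hz),
    which carries most motor-imagery information, attenuating slow
    drift and high-frequency muscle/blink artifacts."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # filtfilt applies the filter forward and backward: zero phase shift
    return filtfilt(b, a, signal)

# Synthetic example: a 10 Hz "motor rhythm" buried in 1 Hz drift
# and 50 Hz line noise, at a typical 250 Hz sampling rate.
fs = 250
t = np.arange(0, 2, 1 / fs)
raw = (np.sin(2 * np.pi * 10 * t)          # signal of interest
       + 2 * np.sin(2 * np.pi * 1 * t)     # slow drift artifact
       + 0.5 * np.sin(2 * np.pi * 50 * t)) # line noise
clean = bandpass_eeg(raw, fs)
```

After filtering, the 10 Hz component survives nearly untouched while the drift and line-noise components are strongly attenuated.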

The portable EEG system Yeo designed, integrating imperceptible microneedle electrodes with soft wireless circuits, offers improved signal acquisition. Accurately measuring those brain signals is critical to determining what actions a user wants to perform, so the team integrated a powerful machine learning algorithm and a virtual reality component to address that challenge.
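The article does not spell out the team's algorithm, so the following is only a generic sketch of how motor-imagery classification can work in principle: imagining movement of one hand suppresses the mu rhythm over the opposite hemisphere, so the log-variance (band power) of each channel separates the two classes, here with a simple nearest-class-mean classifier on synthetic data. Every name and number is an illustrative assumption, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

def logvar_features(trials):
    """Log-variance of each channel: a classic band-power feature
    for motor-imagery EEG (trials shaped n_trials x channels x samples)."""
    return np.log(np.var(trials, axis=-1))

def make_trials(n, strong_ch):
    """Synthetic 2-channel trials: one channel carries more power,
    mimicking contralateral desynchronization for one imagined hand."""
    trials = rng.normal(0.0, 1.0, size=(n, 2, 250))
    trials[:, strong_ch, :] *= 3.0
    return trials

# 50 trials per class: class 0 = strong channel 0, class 1 = strong channel 1
X = np.vstack([logvar_features(make_trials(50, 0)),
               logvar_features(make_trials(50, 1))])
y = np.array([0] * 50 + [1] * 50)

# Nearest-class-mean classifier: assign each trial to the closest class centroid
means = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(((X[:, None, :] - means[None]) ** 2).sum(-1), axis=1)
acc = (pred == y).mean()
```

On real EEG the classes overlap far more, which is why stronger models (and the clean signals from soft microneedle electrodes) matter.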

The new system was tested with four human subjects, but hasn’t been studied with disabled individuals yet.

“This is just a first demonstration, but we’re thrilled with what we have seen,” noted Yeo, Director of Georgia Tech’s Center for Human-Centric Interfaces and Engineering under the Institute for Electronics and Nanotechnology, and a member of the Petit Institute for Bioengineering and Bioscience.

New Paradigm

Yeo’s team originally introduced a soft, wearable EEG brain-machine interface in a 2019 study published in Nature Machine Intelligence. The lead author of that work, Musa Mahmood, was also the lead author of the team’s new research paper.

“This new brain-machine interface uses an entirely different paradigm, involving imagined motor actions, such as grasping with either hand, which frees the subject from having to look at too many stimuli,” said Mahmood, a Ph.D. student in Yeo’s lab.

In the 2021 study, users demonstrated accurate control of virtual reality exercises using their thoughts – their motor imagery. The visual cues enhance the process for both the user and the researchers gathering information.

“The virtual prompts have proven to be very helpful,” Yeo said. “They speed up and improve user engagement and accuracy. And we were able to record continuous, high-quality motor imagery activity.”

According to Mahmood, future work on the system will focus on optimizing electrode placement and more advanced integration of stimulus-based EEG, using what they’ve learned from the last two studies.

This research was supported by the National Institutes of Health (NIH R21AG064309), the Center Grant (Human-Centric Interfaces and Engineering) at Georgia Tech, the National Research Foundation of Korea (NRF-2018M3A7B4071109 and NRF-2019R1A2C2086085), and the Yonsei-KIST Convergence Research Program. Georgia Tech has a pending patent application related to the work described in this paper.

Citation: Musa Mahmood, et al., “Wireless Soft Scalp Electronics and Virtual Reality System for Motor Imagery-based Brain-Machine Interfaces.” (Advanced Science, July 2021)

Links

Woon-Hong Yeo

“Wireless Soft Scalp Electronics and Virtual Reality System for Motor Imagery-based Brain-Machine Interfaces”
