Title: An attention-based motor imagery brain-computer interface system for lower limb exoskeletons
Authors: Xinzhi Ma, Weihai Chen, Zhongcai Pei, Jing Zhang
Journal: Review of Scientific Instruments, vol. 95, no. 12 (December 2024)
DOI: 10.1063/5.0243337
Citations: 0
Abstract
Lower-limb exoskeletons have become increasingly popular in rehabilitation, helping patients with disabilities regain mobility and independence. The brain-computer interface (BCI) offers a natural control method for these exoskeletons, allowing users to operate them through their electroencephalogram (EEG) signals. However, the limited EEG decoding performance of BCI systems restricts their application to lower limb exoskeletons. To address this challenge, we propose an attention-based motor imagery BCI system for lower limb exoskeletons. The decoding module of the proposed BCI system combines a convolutional neural network (CNN) with a lightweight attention module. The CNN extracts meaningful features from the EEG signals, while the lightweight attention module captures global dependencies among these features. The experiments are divided into offline and online parts: the offline experiment evaluates the effectiveness of different decoding methods, while the online experiment is conducted on a customized lower limb exoskeleton to evaluate the proposed BCI system. Eight subjects are recruited for the experiments. The experimental results demonstrate the strong classification performance of the decoding method and validate the feasibility of the proposed BCI system. Our approach establishes a promising BCI system for the lower limb exoskeleton and is expected to enable a more effective and user-friendly rehabilitation process.
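The decoding pipeline described in the abstract (convolutional feature extraction followed by a lightweight attention stage that models global dependencies among the features) can be sketched roughly as follows. This is a hypothetical illustration in NumPy, not the authors' implementation: the filter counts, kernel length, channel count, and the single-head scaled dot-product attention are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def temporal_conv(eeg, kernels):
    # Illustrative CNN stage: eeg is (channels, samples), kernels is
    # (n_filters, kernel_len). Each filter is convolved along time and
    # averaged over EEG channels, yielding (n_filters, time_steps).
    n_f, k = kernels.shape
    c, t = eeg.shape
    out = np.empty((n_f, t - k + 1))
    for f in range(n_f):
        acc = np.zeros(t - k + 1)
        for ch in range(c):
            acc += np.convolve(eeg[ch], kernels[f][::-1], mode="valid")
        out[f] = acc / c
    return out

def self_attention(x):
    # Lightweight attention stage: single-head scaled dot-product
    # self-attention over x of shape (tokens, dim), capturing global
    # dependencies among the convolutional features.
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(scores)
    w /= w.sum(axis=1, keepdims=True)            # rows sum to 1
    return w @ x

# Toy motor-imagery trial: 8 EEG channels, 256 time samples (assumed sizes).
eeg = rng.standard_normal((8, 256))
feats = temporal_conv(eeg, rng.standard_normal((4, 16)))  # (4, 241)
tokens = feats.T                                          # one token per step
attended = self_attention(tokens)                         # (241, 4)
# Pooled features mapped to 2 motor-imagery classes by a random linear head.
logits = attended.mean(axis=0) @ rng.standard_normal((4, 2))
print(logits.shape)
```

In a trained system the convolution kernels and the classification head would be learned jointly; here they are random, so only the data flow and tensor shapes of the two-stage design are shown.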
About the journal:
Review of Scientific Instruments is committed to the publication of advances in scientific instruments, apparatuses, and techniques. RSI seeks to meet the needs of engineers and scientists in physics, chemistry, and the life sciences.