{"title":"E2FNet:基于脑电图和肌电图的手部运动意图识别融合网络","authors":"Guoqian Jiang;Kunyu Wang;Qun He;Ping Xie","doi":"10.1109/JSEN.2024.3471894","DOIUrl":null,"url":null,"abstract":"In light of the growing population of individuals with limb disorders, there is an increasing need to address the challenges they face in their daily lives. Existing rehabilitation technologies, often relying on single physiological signals and plagued by poor signal quality, have limitations in their effectiveness. To overcome these constraints, we present E2FNet, a multimodal physiological information fusion network designed for motor intent recognition in individuals with limb disorders. This study involved eight healthy participants who recorded electromyography (EMG) and electroencephalography (EEG) signals during various hand movements. E2FNet utilizes a multiscale convolutional neural network to extract features from EEG and EMG data, focusing on information fusion across different scales. We also introduce a cross-attention mechanism to capture cross-modal information interactions, enhancing EEG and EMG information fusion. Through extensive experiments, E2FNet achieved an impressive 92.08% classification accuracy, and the effectiveness of each module has been verified. Multiscale separable convolution and cross-attention significantly improved EEG and EMG signal fusion, enhancing accuracy and robustness in motion intent recognition. 
This research promises to enhance the quality of life and independence of individuals with movement disorders, while also advancing the field of rehabilitation robotics and assistive technology.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"24 22","pages":"38417-38428"},"PeriodicalIF":4.3000,"publicationDate":"2024-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"E2FNet: An EEG- and EMG-Based Fusion Network for Hand Motion Intention Recognition\",\"authors\":\"Guoqian Jiang;Kunyu Wang;Qun He;Ping Xie\",\"doi\":\"10.1109/JSEN.2024.3471894\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In light of the growing population of individuals with limb disorders, there is an increasing need to address the challenges they face in their daily lives. Existing rehabilitation technologies, often relying on single physiological signals and plagued by poor signal quality, have limitations in their effectiveness. To overcome these constraints, we present E2FNet, a multimodal physiological information fusion network designed for motor intent recognition in individuals with limb disorders. This study involved eight healthy participants who recorded electromyography (EMG) and electroencephalography (EEG) signals during various hand movements. E2FNet utilizes a multiscale convolutional neural network to extract features from EEG and EMG data, focusing on information fusion across different scales. We also introduce a cross-attention mechanism to capture cross-modal information interactions, enhancing EEG and EMG information fusion. Through extensive experiments, E2FNet achieved an impressive 92.08% classification accuracy, and the effectiveness of each module has been verified. Multiscale separable convolution and cross-attention significantly improved EEG and EMG signal fusion, enhancing accuracy and robustness in motion intent recognition. 
This research promises to enhance the quality of life and independence of individuals with movement disorders, while also advancing the field of rehabilitation robotics and assistive technology.\",\"PeriodicalId\":447,\"journal\":{\"name\":\"IEEE Sensors Journal\",\"volume\":\"24 22\",\"pages\":\"38417-38428\"},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2024-10-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Sensors Journal\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10706790/\",\"RegionNum\":2,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10706790/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
E2FNet: An EEG- and EMG-Based Fusion Network for Hand Motion Intention Recognition
In light of the growing population of individuals with limb disorders, there is an increasing need to address the challenges they face in their daily lives. Existing rehabilitation technologies, which often rely on a single physiological signal and suffer from poor signal quality, are limited in their effectiveness. To overcome these constraints, we present E2FNet, a multimodal physiological information fusion network designed for motor intent recognition in individuals with limb disorders. In this study, electromyography (EMG) and electroencephalography (EEG) signals were recorded from eight healthy participants during various hand movements. E2FNet uses a multiscale convolutional neural network to extract features from the EEG and EMG data, with an emphasis on fusing information across different scales. We also introduce a cross-attention mechanism that captures cross-modal interactions, further enhancing the fusion of EEG and EMG information. In extensive experiments, E2FNet achieved 92.08% classification accuracy, and the effectiveness of each module was verified. Multiscale separable convolution and cross-attention significantly improved EEG and EMG signal fusion, enhancing the accuracy and robustness of motion intent recognition. This research promises to enhance the quality of life and independence of individuals with movement disorders, while also advancing the field of rehabilitation robotics and assistive technology.
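The abstract does not specify E2FNet's exact layer configuration, so the following is only a minimal sketch of the cross-attention idea it describes: feature tokens derived from one modality (here, EEG) attend over tokens from the other (EMG), producing EMG-informed EEG features. All shapes, names, and dimensions below are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query_feats, key_value_feats, d_k):
    # Queries come from one modality (e.g., EEG features);
    # keys and values come from the other (e.g., EMG features).
    scores = query_feats @ key_value_feats.T / np.sqrt(d_k)   # (Nq, Nkv)
    weights = softmax(scores, axis=-1)                        # rows sum to 1
    return weights @ key_value_feats                          # (Nq, d_k)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((16, 32))   # 16 hypothetical EEG feature tokens, dim 32
emg = rng.standard_normal((20, 32))   # 20 hypothetical EMG feature tokens, dim 32
fused = cross_attention(eeg, emg, d_k=32)
print(fused.shape)  # (16, 32): one EMG-informed vector per EEG token
```

In a full model this operation would typically be applied symmetrically (EMG attending over EEG as well) with learned query/key/value projections, and the outputs fed to a classifier; this sketch omits those learned parts for clarity.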
Journal introduction:
The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing, and applications of devices for sensing and transducing physical, chemical, and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensor-actuators. The IEEE Sensors Journal deals with the following:
-Sensor Phenomenology, Modelling, and Evaluation
-Sensor Materials, Processing, and Fabrication
-Chemical and Gas Sensors
-Microfluidics and Biosensors
-Optical Sensors
-Physical Sensors: Temperature, Mechanical, Magnetic, and others
-Acoustic and Ultrasonic Sensors
-Sensor Packaging
-Sensor Networks
-Sensor Applications
-Sensor Systems: Signals, Processing, and Interfaces
-Actuators and Sensor Power Systems
-Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
-Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave (e.g., electromagnetic and acoustic) and non-wave (e.g., chemical, gravity, particle, thermal, radiative and non-radiative) sensor data; detection, estimation, and classification based on sensor data)
-Sensors in Industrial Practice