A Novel Multi-Feature Fusion Network With Spatial Partitioning Strategy and Cross-Attention for Armband-Based Gesture Recognition

IEEE Transactions on Neural Systems and Rehabilitation Engineering · IF 4.8 · CAS Region 2 (Medicine) · JCR Q2 (Engineering, Biomedical) · Publication date: 2024-10-28 · DOI: 10.1109/TNSRE.2024.3487216
Fo Hu; Mengyuan Qian; Kailun He; Wen-An Zhang; Xusheng Yang
Citations: 0

Abstract

Effectively integrating the time-space-frequency information of multi-modal signals from armband sensors, including surface electromyogram (sEMG) and accelerometer data, is critical for accurate gesture recognition. Existing approaches often neglect the abundant spatial relationships inherent in multi-channel sEMG signals obtained via armband sensors and struggle to harness the correlations across multiple feature domains. To address this issue, we propose a novel multi-feature fusion network with a spatial partitioning strategy and cross-attention (MFN-SPSCA) to improve the accuracy and robustness of gesture recognition. Specifically, a spatiotemporal graph convolution module with a spatial partitioning strategy is designed to capture the latent spatial features of multi-channel sEMG signals. Additionally, we design a cross-attention fusion module to learn and prioritize the importance of, and correlations among, multiple feature domains. Extensive experiments demonstrate that MFN-SPSCA outperforms other state-of-the-art methods on a self-collected dataset and the Ninapro DB5 dataset. Our work addresses the challenge of recognizing gestures from multi-modal data collected by armband sensors, emphasizing the importance of integrating time-space-frequency information. Code is available at https://github.com/ZJUTofBrainIntelligence/MFN-SPSCA.
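The abstract's spatial partitioning strategy can be illustrated with a minimal sketch. The paper's exact partitioning scheme is not described here, so this example assumes one plausible layout: armband electrodes arranged in a ring, with neighbours split into three partitions (self, clockwise, counterclockwise), each partition receiving its own weight matrix in a single graph-convolution step.

```python
import numpy as np

def ring_partitions(n_channels):
    """Build partitioned adjacency matrices for electrodes arranged in a
    ring (as on an armband): one partition for self-connections, one for
    each channel's clockwise neighbour, one for its counterclockwise
    neighbour. (Illustrative assumption, not the paper's exact scheme.)"""
    idx = np.arange(n_channels)
    a_self = np.eye(n_channels)
    a_cw = np.zeros((n_channels, n_channels))
    a_ccw = np.zeros((n_channels, n_channels))
    a_cw[idx, (idx + 1) % n_channels] = 1.0   # clockwise neighbour
    a_ccw[idx, (idx - 1) % n_channels] = 1.0  # counterclockwise neighbour
    return [a_self, a_cw, a_ccw]

def partitioned_graph_conv(x, adjs, weights):
    """One spatial graph-convolution step: each partition aggregates with
    its own weight matrix, and the partial results are summed.
    x: (n_channels, n_features); weights: list of (n_features, n_out)."""
    return sum(a @ x @ w for a, w in zip(adjs, weights))

rng = np.random.default_rng(0)
n_ch, n_in, n_out = 8, 4, 6          # e.g. an 8-channel sEMG armband
x = rng.standard_normal((n_ch, n_in))
adjs = ring_partitions(n_ch)
ws = [rng.standard_normal((n_in, n_out)) * 0.1 for _ in adjs]
y = partitioned_graph_conv(x, adjs, ws)
print(y.shape)  # (8, 6)
```

Partitioning lets spatially distinct neighbour roles (e.g. the two adjacent electrodes around the forearm) be weighted separately rather than averaged together, which is the motivation for partition-based graph convolutions generally.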
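Likewise, the cross-attention fusion module can be sketched as standard scaled dot-product cross-attention, where queries come from one feature domain and keys/values from another. The feature dimensions and the pairing of domains below (sEMG-derived tokens attending to accelerometer-derived tokens) are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def softmax(z, axis=-1):
    """Numerically stable softmax."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(q_feats, kv_feats, wq, wk, wv):
    """Scaled dot-product cross-attention: queries are projected from one
    feature domain, keys and values from another, so the output re-weights
    the second domain by its relevance to the first."""
    q = q_feats @ wq                      # (n_q, d)
    k = kv_feats @ wk                     # (n_kv, d)
    v = kv_feats @ wv                     # (n_kv, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v   # (n_q, d)

rng = np.random.default_rng(1)
d = 16
emg = rng.standard_normal((10, d))   # 10 tokens from an sEMG feature domain
acc = rng.standard_normal((5, d))    # 5 tokens from an accelerometer domain
wq, wk, wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
fused = cross_attention(emg, acc, wq, wk, wv)
print(fused.shape)  # (10, 16)
```

Because the attention weights are learned, this kind of module can prioritize whichever feature domain is most informative for a given gesture, which is the role the abstract assigns to the cross-attention fusion module.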
Source journal metrics: CiteScore 8.60; self-citation rate 8.20%; annual articles 479; review time 6-12 weeks.
Journal scope: Rehabilitative and neural aspects of biomedical engineering, including functional electrical stimulation, acoustic dynamics, human performance measurement and analysis, nerve stimulation, electromyography, motor control and stimulation; and hardware and software applications for rehabilitation engineering and assistive devices.