Enhanced EMG-Based Hand Gesture Classification in Real-World Scenarios: Mitigating Dynamic Factors With Tempo-Spatial Wavelet Transform and Deep Learning

Parul Rani; Sidharth Pancholi; Vikash Shaw; Manfredo Atzori; Sanjeev Kumar
IEEE Transactions on Medical Robotics and Bionics, vol. 6, no. 3, pp. 1202-1211 (IF 3.4, JCR Q2, Engineering, Biomedical)
Published: 2024-06-03 · DOI: 10.1109/TMRB.2024.3408896 · https://ieeexplore.ieee.org/document/10547182/

Abstract

Dynamic factors, such as limb position changes and electrode shift, significantly degrade the performance of EMG-based hand gesture classification when moving from a laboratory-controlled environment to real-life scenarios. Traditionally, researchers have employed conventional wavelet transform methods to improve classification performance. This study proposes a tempo-spatial technique based on wavelet multiresolution analysis and compares it with the conventional wavelet transform across eight machine learning algorithms. Two public datasets are used: DB1 comprises ideal conditions with a range of limb positions, while DB2 incorporates dynamic factors such as electrode shift and muscle fatigue. Training and testing cover two cases: one using single-position data and the other using multiple positions. Results show that the Deep Neural Network (DNN) classifier outperforms the others in classification accuracy. Under single-position DNN training, the proposed technique achieves mean accuracies of 84.07% (DB1) and 68.15% (DB2), versus 79.39% (DB1) and 53.48% (DB2) for the conventional wavelet transform. For multiple positions, specifically two limb positions, the proposed technique achieves mean accuracies of 94.43% (DB1) and 73.79% (DB2), compared with 84.38% (DB1) and 51.98% (DB2) for the conventional wavelet transform with the DNN. Paired t-tests (p < 0.001) show a significant improvement over the conventional wavelet transform, indicating the proposed technique's potential to enhance gesture classification in real-world scenarios.
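The abstract does not specify the exact decomposition used by the tempo-spatial technique, but wavelet multiresolution analysis of an EMG window typically splits the signal into frequency bands and summarizes each band, e.g. by its relative energy. As a minimal, self-contained sketch (assuming a plain Haar DWT and energy marginals, not the authors' exact pipeline):

```python
import numpy as np

def haar_dwt_level(x):
    """One level of the Haar DWT: return (approximation, detail) coefficients."""
    x = x[: len(x) // 2 * 2]              # truncate to even length
    a = (x[0::2] + x[1::2]) / np.sqrt(2)  # low-pass (approximation)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)  # high-pass (detail)
    return a, d

def wavelet_energy_features(x, levels=4):
    """Relative energy of each detail band plus the final approximation band.

    A common multiresolution feature vector for an EMG window: the energy in
    each of the `levels` detail bands and the coarsest approximation band,
    normalised by total energy so the descriptor is amplitude-invariant.
    """
    feats = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt_level(a)
        feats.append(np.sum(d ** 2))
    feats.append(np.sum(a ** 2))
    feats = np.array(feats)
    return feats / feats.sum()

# Synthetic 200 ms EMG window at 2 kHz (random noise stands in for real sEMG).
rng = np.random.default_rng(0)
window = rng.standard_normal(400)
f = wavelet_energy_features(window)
print(f.round(3))  # 5 relative energies summing to 1
```

Because the Haar transform is orthogonal, total energy is preserved across levels, so the normalised vector always sums to one regardless of window amplitude.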
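The paired t-test reported above compares the two feature pipelines on matched units (the same subjects/positions under both methods). A sketch of that test with numpy alone, using hypothetical per-subject accuracies (the abstract gives only the means, not per-subject values):

```python
import numpy as np

# Hypothetical per-subject accuracies (%) for eight subjects; the real
# per-subject numbers are not reported in the abstract.
proposed     = np.array([85.1, 83.2, 86.0, 82.5, 84.9, 83.7, 84.4, 83.0])
conventional = np.array([80.0, 78.9, 81.2, 77.5, 80.3, 79.1, 79.8, 78.4])

d = proposed - conventional  # paired differences: same subject, two methods
t = d.mean() / (d.std(ddof=1) / np.sqrt(len(d)))
print(f"t = {t:.1f} on {len(d) - 1} degrees of freedom")
# For df = 7, |t| > 5.41 corresponds to a two-tailed p < 0.001.
```

Pairing matters here: differencing within subjects removes between-subject variability, which is typically large in EMG studies, before testing whether the mean improvement is nonzero.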