High-precision dynamic gesture recognition based on microfiber sensor and EMT-Net

IF 4.1 · JCR Q2 (Engineering, Electrical & Electronic) · Sensors and Actuators A: Physical · Published: 2024-09-11 · DOI: 10.1016/j.sna.2024.115852

Abstract

Dynamic gesture recognition, which combines flexible wearable sensors with deep learning, is invaluable for human–computer interaction. Nevertheless, two primary challenges persist: the rapid detection of intricate gestures and the accurate recognition of dynamic signals. In this study, we use a microfiber sensor to detect variations in the skin at the wrist and thereby capture dynamic gestures. To address insufficient feature extraction from the detected signals, which reduces recognition accuracy, we introduce a network dubbed EMT-Net (improved multi-head attention transformer network). The network uses a transformer encoder to capture and represent the characteristics of dynamic gesture signals and a CNN to classify the encoded features. To ensure that the model comprehensively captures both the temporal and the statistical characteristics of the signals, we enhance the multi-head attention mechanism by restricting certain attention heads to attend solely to the statistical features of the signals while allowing the others to focus on temporal features and global dependencies. Furthermore, because different features differ in discriminative power, we developed an attention module that redistributes the attention weights over the statistical features. The experimental results demonstrate that the microfiber sensor effectively captures ten distinct forms of dynamic gesture signals, and that EMT-Net identifies them with an accuracy of 98.80%, a precision of 98.81%, a recall of 98.80%, and an F1 score of 98.80%. This dynamic gesture recognition technology, built on microfiber sensors and EMT-Net, has significant application value and is expected to transform human–computer interaction, virtual reality, and other domains.
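The attention-head split described in the abstract — some heads confined to statistical features of the signal, the others attending over the temporal sequence — can be sketched as follows. This is a toy illustration, not the authors' implementation: the head counts, embedding size, random projections, and the choice of statistics (mean, std, min, max) are all assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # standard scaled dot-product attention
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def split_head_attention(signal, n_heads=4, n_stat_heads=2, d=8, seed=0):
    """Toy version of the EMT-Net idea: the first `n_stat_heads`
    heads see only statistical features of the window, the rest
    see the raw temporal sequence. Shapes are illustrative."""
    rng = np.random.default_rng(seed)
    # temporal tokens: one d-dim embedding per timestep (random projection)
    temporal = signal[:, None] * rng.standard_normal((1, d))
    # statistical tokens: mean, std, min, max of the window
    stats = np.array([signal.mean(), signal.std(), signal.min(), signal.max()])
    stat_tokens = stats[:, None] * rng.standard_normal((1, d))
    heads = []
    for h in range(n_heads):
        tokens = stat_tokens if h < n_stat_heads else temporal
        Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
        # pool each head's output into one d-dim vector
        heads.append(attention(tokens @ Wq, tokens @ Wk, tokens @ Wv).mean(axis=0))
    # concatenated head outputs would feed the CNN classifier
    return np.concatenate(heads)

feat = split_head_attention(np.sin(np.linspace(0, 6, 50)))
print(feat.shape)  # (32,) — 4 heads x 8 dims each
```

In the paper the encoder is a full transformer and the pooled features go to a CNN classifier; here the point is only the routing, i.e. which token set each head is allowed to attend over.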


Source journal: Sensors and Actuators A: Physical (Engineering, Electrical & Electronic)
CiteScore: 8.10 · Self-citation rate: 6.50% · Articles per year: 630 · Review time: 49 days
Journal description: Sensors and Actuators A: Physical brings together multidisciplinary interests in one journal entirely devoted to disseminating information on all aspects of research and development of solid-state devices for transducing physical signals. Sensors and Actuators A: Physical regularly publishes original papers, letters to the Editors and, from time to time, invited review articles within the following device areas: • Fundamentals and Physics, such as: classification of effects, physical effects, measurement theory, modelling of sensors, measurement standards, measurement errors, units and constants, time and frequency measurement. Modeling papers should bring new modeling techniques to the field and be supported by experimental results. • Materials and their Processing, such as: piezoelectric materials, polymers, metal oxides, III-V and II-VI semiconductors, thick and thin films, optical glass fibres, amorphous, polycrystalline and monocrystalline silicon. • Optoelectronic sensors, such as: photovoltaic diodes, photoconductors, photodiodes, phototransistors, position-sensitive photodetectors, optoisolators, photodiode arrays, charge-coupled devices, light-emitting diodes, injection lasers and liquid-crystal displays. • Mechanical sensors, such as: metallic, thin-film and semiconductor strain gauges, diffused silicon pressure sensors, silicon accelerometers, solid-state displacement transducers, piezo junction devices, piezoelectric field-effect transducers (PiFETs), tunnel-diode strain sensors, surface acoustic wave devices, silicon micromechanical switches, solid-state flow meters and electronic flow controllers. Etc.
Latest articles in this journal:
High-selectivity NIR amorphous silicon-based plasmonic photodetector at room temperature
2D beam steering using phased array of MEMS tunable grating couplers
Focus-switchable piezoelectric actuator: A bionic thin-plate design inspired by conch structure
Methods of fabrication and modeling of CMUTs – A review
Effect of material anisotropy on the first-order vibration of piezoelectric oscillators in circular plate configurations