A Method of Detecting Human Movement Intentions in Real Environments.

Yi-Xing Liu, Zhao-Yuan Wan, Ruoli Wang, Elena M Gutierrez-Farewik
{"title":"一种在真实环境中检测人类运动意图的方法。","authors":"Yi-Xing Liu, Zhao-Yuan Wan, Ruoli Wang, Elena M Gutierrez-Farewik","doi":"10.1109/ICORR58425.2023.10304774","DOIUrl":null,"url":null,"abstract":"<p><p>Accurate and timely movement intention detection can facilitate exoskeleton control during transitions between different locomotion modes. Detecting movement intentions in real environments remains a challenge due to unavoidable environmental uncertainties. False movement intention detection may also induce risks of falling and general danger for exoskeleton users. To this end, in this study, we developed a method for detecting human movement intentions in real environments. The proposed method is capable of online self-correcting by implementing a decision fusion layer. Gaze data from an eye tracker and inertial measurement unit (IMU) signals were fused at the feature extraction level and used to predict movement intentions using 2 different methods. Images from the scene camera embedded on the eye tracker were used to identify terrains using a convolutional neural network. The decision fusion was made based on the predicted movement intentions and identified terrains. Four able-bodied participants wearing the eye tracker and 7 IMU sensors took part in the experiments to complete the tasks of level ground walking, ramp ascending, ramp descending, stairs ascending, and stair descending. The recorded experimental data were used to test the feasibility of the proposed method. An overall accuracy of 93.4% was achieved when both feature fusion and decision fusion were used. Fusing gaze data with IMU signals improved the prediction accuracy.</p>","PeriodicalId":73276,"journal":{"name":"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]","volume":"2023 ","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2023-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Method of Detecting Human Movement Intentions in Real Environments.\",\"authors\":\"Yi-Xing Liu, Zhao-Yuan Wan, Ruoli Wang, Elena M Gutierrez-Farewik\",\"doi\":\"10.1109/ICORR58425.2023.10304774\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Accurate and timely movement intention detection can facilitate exoskeleton control during transitions between different locomotion modes. Detecting movement intentions in real environments remains a challenge due to unavoidable environmental uncertainties. False movement intention detection may also induce risks of falling and general danger for exoskeleton users. To this end, in this study, we developed a method for detecting human movement intentions in real environments. The proposed method is capable of online self-correcting by implementing a decision fusion layer. Gaze data from an eye tracker and inertial measurement unit (IMU) signals were fused at the feature extraction level and used to predict movement intentions using 2 different methods. Images from the scene camera embedded on the eye tracker were used to identify terrains using a convolutional neural network. The decision fusion was made based on the predicted movement intentions and identified terrains. Four able-bodied participants wearing the eye tracker and 7 IMU sensors took part in the experiments to complete the tasks of level ground walking, ramp ascending, ramp descending, stairs ascending, and stair descending. The recorded experimental data were used to test the feasibility of the proposed method. 
An overall accuracy of 93.4% was achieved when both feature fusion and decision fusion were used. Fusing gaze data with IMU signals improved the prediction accuracy.</p>\",\"PeriodicalId\":73276,\"journal\":{\"name\":\"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]\",\"volume\":\"2023 \",\"pages\":\"1-6\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-09-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICORR58425.2023.10304774\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE ... International Conference on Rehabilitation Robotics : [proceedings]","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICORR58425.2023.10304774","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Accurate and timely movement intention detection can facilitate exoskeleton control during transitions between different locomotion modes. Detecting movement intentions in real environments remains a challenge due to unavoidable environmental uncertainties, and false movement intention detection may also induce risks of falling and general danger for exoskeleton users. To this end, in this study, we developed a method for detecting human movement intentions in real environments. The proposed method is capable of online self-correction through a decision fusion layer. Gaze data from an eye tracker and inertial measurement unit (IMU) signals were fused at the feature-extraction level and used to predict movement intentions with two different methods. Images from the scene camera embedded in the eye tracker were used to identify terrains with a convolutional neural network. The decision fusion was made based on the predicted movement intentions and the identified terrains. Four able-bodied participants wearing the eye tracker and seven IMU sensors took part in the experiments, completing tasks of level-ground walking, ramp ascending, ramp descending, stair ascending, and stair descending. The recorded experimental data were used to test the feasibility of the proposed method. An overall accuracy of 93.4% was achieved when both feature fusion and decision fusion were used; fusing gaze data with IMU signals improved the prediction accuracy.
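The abstract outlines the pipeline (feature-level fusion of gaze and IMU data feeding two intention predictors, a scene-camera CNN for terrain identification, and a decision fusion layer that self-corrects implausible predictions) but gives no implementation details. The Python sketch below is one plausible reading of that architecture, not the authors' implementation: the averaging fusion rule, the terrain-compatibility table, and all names (MODES, COMPATIBLE, fuse_features, decision_fusion) are assumptions introduced for illustration.

```python
import numpy as np

# Locomotion modes from the experiment protocol in the abstract.
MODES = ["level_walk", "ramp_ascend", "ramp_descend", "stair_ascend", "stair_descend"]

# Hypothetical terrain labels a scene-camera CNN might output, with the
# intentions assumed plausible on each terrain (not given in the abstract).
COMPATIBLE = {
    "level_ground": {"level_walk"},
    "ramp":         {"level_walk", "ramp_ascend", "ramp_descend"},
    "stairs":       {"level_walk", "stair_ascend", "stair_descend"},
}

def fuse_features(gaze_feat: np.ndarray, imu_feat: np.ndarray) -> np.ndarray:
    """Feature-level fusion: concatenate gaze and IMU feature vectors
    into a single input for the intention predictors."""
    return np.concatenate([gaze_feat, imu_feat])

def decision_fusion(probs_a: np.ndarray, probs_b: np.ndarray, terrain: str) -> str:
    """Fuse two intention predictors' outputs and self-correct against
    the terrain identified by the scene-camera CNN.

    probs_a, probs_b: per-mode probabilities, ordered as in MODES.
    terrain: terrain label produced by the CNN.
    """
    probs = (probs_a + probs_b) / 2.0          # one simple fusion rule: averaging
    allowed = COMPATIBLE.get(terrain, set(MODES))
    # Suppress intentions implausible on the identified terrain; this is
    # one way a decision layer could reject false detections.
    mask = np.array([1.0 if m in allowed else 0.0 for m in MODES])
    corrected = probs * mask
    if corrected.sum() == 0.0:                 # nothing compatible survived
        return MODES[int(np.argmax(probs))]    # fall back to the raw fusion
    return MODES[int(np.argmax(corrected))]

# Example: both predictors lean toward stair ascent; the decision layer
# keeps that prediction on stairs but overrules it on level ground.
p1 = np.array([0.30, 0.05, 0.05, 0.50, 0.10])
p2 = np.array([0.25, 0.10, 0.10, 0.45, 0.10])
print(decision_fusion(p1, p2, "stairs"))        # -> "stair_ascend"
print(decision_fusion(p1, p2, "level_ground"))  # -> "level_walk"
```

In this reading, the decision layer can overrule a momentary misclassification, such as a stair-ascent prediction while the camera sees level ground, which is one way the online self-correction described above could behave.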
