Explorations of Autonomous Prosthetic Grasping via Proximity Vision and Deep Learning

IF 3.4 | Q2 (ENGINEERING, BIOMEDICAL) | IEEE Transactions on Medical Robotics and Bionics | Pub Date: 2024-03-14 | DOI: 10.1109/TMRB.2024.3377530
E. Mastinu; A. Coletti; J. van den Berg; C. Cipriani
{"title":"通过近距离视觉和深度学习探索自主假肢抓取技术","authors":"E. Mastinu;A. Coletti;J. van den Berg;C. Cipriani","doi":"10.1109/TMRB.2024.3377530","DOIUrl":null,"url":null,"abstract":"The traumatic loss of a hand is usually followed by significant psychological, functional and rehabilitation challenges. Even though much progress has been reached in the past decades, the prosthetic challenge of restoring the human hand functionality is still far from being achieved. Autonomous prosthetic hands showed promising results and wide potential benefit, a benefit that must be still explored and deployed. Here, we hypothesized that a combination of a radar sensor and a low-resolution time-of-flight camera can be sufficient for object recognition in both static and dynamic scenarios. To test this hypothesis, we analyzed via deep learning algorithms HANDdata, a human-object interaction dataset with particular focus on reach-to-grasp actions. Inference testing was also performed on unseen data purposely acquired. The analyses reported here, broken down to gradually increasing levels of complexity, showed a great potential of using such proximity sensors as alternative or complementary solution to standard camera-based systems. In particular, integrated and low-power radar can be a potential key technology for next generation intelligent and autonomous prostheses.","PeriodicalId":73318,"journal":{"name":"IEEE transactions on medical robotics and bionics","volume":null,"pages":null},"PeriodicalIF":3.4000,"publicationDate":"2024-03-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10472622","citationCount":"0","resultStr":"{\"title\":\"Explorations of Autonomous Prosthetic Grasping via Proximity Vision and Deep Learning\",\"authors\":\"E. Mastinu;A. Coletti;J. van den Berg;C. Cipriani\",\"doi\":\"10.1109/TMRB.2024.3377530\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The traumatic loss of a hand is usually followed by significant psychological, functional and rehabilitation challenges. Even though much progress has been reached in the past decades, the prosthetic challenge of restoring the human hand functionality is still far from being achieved. Autonomous prosthetic hands showed promising results and wide potential benefit, a benefit that must be still explored and deployed. Here, we hypothesized that a combination of a radar sensor and a low-resolution time-of-flight camera can be sufficient for object recognition in both static and dynamic scenarios. To test this hypothesis, we analyzed via deep learning algorithms HANDdata, a human-object interaction dataset with particular focus on reach-to-grasp actions. Inference testing was also performed on unseen data purposely acquired. The analyses reported here, broken down to gradually increasing levels of complexity, showed a great potential of using such proximity sensors as alternative or complementary solution to standard camera-based systems. 
In particular, integrated and low-power radar can be a potential key technology for next generation intelligent and autonomous prostheses.\",\"PeriodicalId\":73318,\"journal\":{\"name\":\"IEEE transactions on medical robotics and bionics\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2024-03-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10472622\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on medical robotics and bionics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10472622/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, BIOMEDICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on medical robotics and bionics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10472622/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, BIOMEDICAL","Score":null,"Total":0}
Citations: 0

Abstract

The traumatic loss of a hand is usually followed by significant psychological, functional, and rehabilitation challenges. Even though much progress has been made in the past decades, the prosthetic goal of restoring human hand functionality is still far from being achieved. Autonomous prosthetic hands have shown promising results and a wide potential benefit, one that remains to be fully explored and deployed. Here, we hypothesized that the combination of a radar sensor and a low-resolution time-of-flight camera can be sufficient for object recognition in both static and dynamic scenarios. To test this hypothesis, we used deep learning algorithms to analyze HANDdata, a human-object interaction dataset with a particular focus on reach-to-grasp actions. Inference testing was also performed on purposely acquired unseen data. The analyses reported here, broken down into gradually increasing levels of complexity, showed great potential for using such proximity sensors as an alternative or complementary solution to standard camera-based systems. In particular, integrated, low-power radar can be a potential key technology for next-generation intelligent and autonomous prostheses.
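To make the sensing setup more concrete, below is a minimal, hypothetical sketch of how a low-resolution time-of-flight depth frame and a radar feature vector could be fused in a small two-branch classifier. The network layout, input sizes (an 8x8 depth frame and a 64-dimensional radar vector), and class count are illustrative assumptions only and do not reproduce the architecture or the HANDdata preprocessing used in the paper.

```python
# Minimal sketch (illustrative assumptions only): late fusion of a low-resolution
# time-of-flight depth frame with a radar feature vector for object classification.
# Input shapes and layer sizes are hypothetical and NOT taken from the paper.
import torch
import torch.nn as nn


class ProximityFusionNet(nn.Module):
    def __init__(self, num_classes: int = 10, radar_dim: int = 64):
        super().__init__()
        # Depth branch: small CNN over an 8x8 single-channel ToF frame.
        self.depth_branch = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),                      # -> (batch, 32)
        )
        # Radar branch: small MLP over a flattened radar feature vector.
        self.radar_branch = nn.Sequential(
            nn.Linear(radar_dim, 32),
            nn.ReLU(),
        )
        # Late fusion: concatenate both embeddings and classify.
        self.classifier = nn.Linear(32 + 32, num_classes)

    def forward(self, depth: torch.Tensor, radar: torch.Tensor) -> torch.Tensor:
        d = self.depth_branch(depth)           # (batch, 32)
        r = self.radar_branch(radar)           # (batch, 32)
        return self.classifier(torch.cat([d, r], dim=1))


if __name__ == "__main__":
    model = ProximityFusionNet()
    depth_frame = torch.randn(4, 1, 8, 8)      # batch of low-resolution depth frames
    radar_features = torch.randn(4, 64)        # batch of radar feature vectors
    logits = model(depth_frame, radar_features)
    print(logits.shape)                        # torch.Size([4, 10])
```

A late-fusion design like this keeps each sensor's preprocessing independent, so a single modality could still be used on its own if the other stream were unavailable; whether that matches the trade-offs explored in the paper is not implied here.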