Reconstructing Soft Robotic Touch via In-Finger Vision

IF 6.8 · Q1 (Automation & Control Systems) · Advanced Intelligent Systems (Weinheim an der Bergstrasse, Germany) · Pub Date: 2024-07-17 · DOI: 10.1002/aisy.202400022
Ning Guo, Xudong Han, Shuqiao Zhong, Zhiyuan Zhou, Jian Lin, Fang Wan, Chaoyang Song
Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1002/aisy.202400022
Citations: 0

Abstract

Incorporating authentic tactile interactions into virtual environments presents a notable challenge for the emerging development of soft robotic metamaterials. In this study, a vision-based approach is introduced to learning proprioceptive interactions by simultaneously reconstructing the shape and touch of a soft robotic metamaterial (SRM) during physical engagements. The SRM design is optimized to the size of a finger with enhanced adaptability in 3D interactions while incorporating a see-through viewing field inside, which can be visually captured by a miniature camera underneath to provide a rich set of image features for touch digitization. Employing constrained geometric optimization, the proprioceptive process with aggregated multi-handles is modeled. This approach facilitates real-time, precise, and realistic estimations of the finger's mesh deformation within a virtual environment. A data-driven learning model is also proposed to estimate touch positions, achieving reliable results with R² scores of 0.9681, 0.9415, and 0.9541 along the x, y, and z axes. Furthermore, the robust performance of the proposed methods is demonstrated in touch-based human–cybernetic interfaces and human–robot collaborative grasping. This study opens the door to future applications of vision-based soft proprioception in touch-based digital twin interactions.
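The per-axis R² scores reported in the abstract can be reproduced with a standard multi-output regression evaluation. The sketch below is illustrative only: the data, model, and variable names are assumptions, not the authors' code or dataset. It fits a placeholder linear estimator mapping image features to 3D touch positions and scores each axis (x, y, z) independently, which is how per-axis R² values like those above are typically computed.

```python
import numpy as np

def r2_per_axis(y_true, y_pred):
    """Coefficient of determination computed independently for each
    output column (here: x, y, z touch coordinates)."""
    ss_res = np.sum((y_true - y_pred) ** 2, axis=0)          # residual sum of squares
    ss_tot = np.sum((y_true - y_true.mean(axis=0)) ** 2, axis=0)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Synthetic stand-in for the paper's data: image features -> 3D touch position.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))                 # hypothetical image-feature vectors
W = rng.normal(size=(16, 3))
Y = X @ W + 0.05 * rng.normal(size=(500, 3))   # noisy 3D touch positions

# Least-squares linear model as a placeholder for the learned estimator.
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
scores = r2_per_axis(Y, X @ W_hat)             # one R^2 score per axis
print(scores)
```

On this synthetic data the linear model is near-exact, so all three scores approach 1; with the paper's real camera features and learned model, the analogous computation yields the quoted 0.9681 / 0.9415 / 0.9541.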



Source journal metrics: CiteScore 1.30 · Self-citation rate 0.00% · Review time 4 weeks