Recalling of multiple grasping methods from an object image with a convolutional neural network

ROBOMECH Journal (IF 1.5, Q3, Instruments & Instrumentation) · Published 2020-08-13 · DOI: 10.21203/rs.3.rs-51700/v1
M. Sanada, T. Matsuo, N. Shimada, Y. Shirai
ROBOMECH Journal, vol. 8, pp. 1–13.
Citations: 0

Abstract

In this study, a method is proposed that allows a robot to recall multiple grasping methods for a given object. The aim is for robots to learn grasping methods for new objects by observing human grasping activities in daily life, without special instructions. In this setting, only one grasping motion is observed for an object at a time, and it is never known whether other grasping methods are possible for that object, even though supervised learning generally requires all possible answers for each training input. The proposed method provides a solution for this learning situation by employing a convolutional neural network with automatic clustering of the observed grasping methods: the grasping methods are clustered while the grasping positions are being learned. The method first recalls grasping positions, with the network estimating a multi-channel heatmap in which each channel indicates one grasping position, and then checks the graspability of each estimated position. Finally, it recalls the hand shapes based on the estimated grasping positions and the object's shape. This paper reports the results of recalling multiple grasping methods and demonstrates the effectiveness of the proposed method.
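The recall step described above (one grasping position per heatmap channel, followed by a graspability check) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: the function name, the use of the per-channel peak value as a graspability score, and the threshold are all assumptions made for the example.

```python
import numpy as np

def recall_grasp_positions(heatmaps, graspability_threshold=0.5):
    """Illustrative sketch: from a multi-channel heatmap of shape (C, H, W),
    take the peak of each channel as one candidate grasping position, then
    keep only candidates whose peak response passes a graspability check.
    The threshold and score definition are assumptions for this example."""
    candidates = []
    for c, hm in enumerate(heatmaps):
        # Peak location of this channel = one recalled grasping position.
        y, x = np.unravel_index(np.argmax(hm), hm.shape)
        score = float(hm[y, x])
        # Graspability check: discard weakly supported positions.
        if score >= graspability_threshold:
            candidates.append({"channel": c,
                               "position": (int(y), int(x)),
                               "score": score})
    return candidates

# Toy example: two channels with confident peaks, one weak channel.
hm = np.zeros((3, 8, 8))
hm[0, 2, 3] = 0.9   # confident grasp candidate at (2, 3)
hm[1, 5, 5] = 0.7   # confident grasp candidate at (5, 5)
hm[2, 1, 1] = 0.2   # below threshold, discarded by the check
print(recall_grasp_positions(hm))
```

In the paper the heatmaps would come from the trained network and the hand shape would then be recalled from each surviving position together with the object's shape; here the heatmaps are hand-built only to show the per-channel selection logic.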
Journal: ROBOMECH Journal (Mathematics: Control and Optimization)
CiteScore: 3.20
Self-citation rate: 7.10%
Articles published: 21
Review time: 13 weeks
Aims and scope: ROBOMECH Journal focuses on advanced technologies and practical applications in the field of Robotics and Mechatronics. This field is driven by steadily growing research, development, and consumer demand for robots and systems. Advanced robots already work in medical and hazardous environments, such as space and the deep sea, as well as in manufacturing. The scope of the journal includes, but is not limited to:
1. Modeling and design
2. System integration
3. Actuators and sensors
4. Intelligent control
5. Artificial intelligence
6. Machine learning
7. Robotics
8. Manufacturing
9. Motion control
10. Vibration and noise control
11. Micro/nano devices and optoelectronics systems
12. Automotive systems
13. Applications for extreme and/or hazardous environments
14. Other applications
Latest articles in this journal:
- Computer vision-based visualization and quantification of body skeletal movements for investigation of traditional skills: the production of Kizumi winnowing baskets
- Measuring unit for synchronously collecting air dose rate and measurement position
- Length control of a McKibben pneumatic actuator using a dynamic quantizer
- Interactive driving of electrostatic film actuator by proximity motion of human body
- Development and flight-test verification of two-dimensional rotational low-airspeed sensor for small helicopters