Detecting insertion tasks using convolutional neural networks during robot teaching-by-demonstration

Etienne Roberge, Vincent Duchaine
2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 3210-3216
DOI: 10.1109/IROS.2017.8206154
Published: September 2017
Citations: 4

Abstract

Today, collaborative robots are often taught new tasks through “teaching by demonstration” techniques rather than manual programming. This works well for many tasks; however, some tasks like precise tight-fitting insertions can be hard to recreate through exact position replays because they also involve forces and are highly affected by the robot's repeatability and the position of the object in the hand. As of yet there is no way to automatically detect when procedures to reduce position uncertainty should be used. In this paper, we present a new way to automatically detect insertion tasks during impedance control-based trajectory teaching. This is accomplished by recording the forces and torques applied by the operator and inputting these signals to a convolutional neural network. The convolutional neural network is used to extract important features of the spatio-temporal forces and torque signals for distinguishing insertion tasks. Eventually, this method could help robots understand the tasks they are taught at a higher level. They will not only be capable of a position-time replay of the task, but will also recognize the best strategy to apply in order to accomplish the task (in this case insertion). Our method was tested on data obtained from 886 experiments that were conducted on eight different in-hand objects. Results show that we can distinguish insertion tasks from pick-and-place tasks with an average accuracy of 82%.
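The pipeline the abstract describes — feed multichannel force/torque recordings through a convolutional network that extracts spatio-temporal features, then classify the demonstrated task — can be sketched with a minimal forward pass. The sketch below is illustrative only, not the authors' architecture: the filter count, kernel width, 100-sample window, and random weights are all assumptions, and a real classifier would be trained on labeled demonstrations.

```python
import numpy as np

def conv1d(signal, kernels, bias):
    """Valid 1-D convolution across the time axis.
    signal:  (C_in, T)        multichannel force/torque trace
    kernels: (C_out, C_in, K) filter bank (random here; learned in practice)
    bias:    (C_out,)
    returns: (C_out, T-K+1)   feature maps
    """
    c_out, c_in, k = kernels.shape
    t_out = signal.shape[1] - k + 1
    out = np.empty((c_out, t_out))
    for o in range(c_out):
        for t in range(t_out):
            out[o, t] = np.sum(kernels[o] * signal[:, t:t + k]) + bias[o]
    return out

rng = np.random.default_rng(0)
ft_signal = rng.standard_normal((6, 100))  # 6 axes (Fx,Fy,Fz,Tx,Ty,Tz), 100 samples
W = rng.standard_normal((8, 6, 5)) * 0.1   # hypothetical: 8 filters of width 5
b = np.zeros(8)

feats = np.maximum(conv1d(ft_signal, W, b), 0.0)  # ReLU nonlinearity
pooled = feats.max(axis=1)                        # global max-pool over time -> (8,)
logit = pooled @ rng.standard_normal(8)           # linear head: insertion score
p_insertion = 1.0 / (1.0 + np.exp(-logit))        # sigmoid: insertion vs. pick-and-place
```

Because the filters slide over time while spanning all six force/torque channels, each feature map responds to a short spatio-temporal force pattern (e.g. the contact transient characteristic of a tight-fitting insertion), and the max-pool makes the classification invariant to when in the demonstration that pattern occurs.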