A New Robotic Grasp Detection Method based on RGB-D Deep Fusion*

Hao Ma, Ding Yuan, Qingke Wang, Hong Zhang
{"title":"A New Robotic Grasp Detection Method based on RGB-D Deep Fusion*","authors":"Hao Ma, Ding Yuan, Qingke Wang, Hong Zhang","doi":"10.1109/RCAR54675.2022.9872259","DOIUrl":null,"url":null,"abstract":"Grasping is one of the most widely used tasks of robots. The application of computer vision can improve robot intelligence. Previous methods simply treated the problem of robotic grasping detection similar to object detection, which ignores the characteristics of the grasping problem, leading to a loss of accuracy. Additionally, treating depth images equally with RGBs is unreasonable. This study proposes a new grasp detection model using an RGB-D deep fusion module that combines multi-scale RGB and depth features. An adaptive anchor box-setting method based on a two-step approximation was designed. With the network-sharing structures of target and grasp detection, the target category and appropriate grasp posture can be obtained end-to-end. Experiments show that compared with other models, ours achieves significant improvement in accuracy while maintaining real-time computing performance.","PeriodicalId":304963,"journal":{"name":"2022 IEEE International Conference on Real-time Computing and Robotics (RCAR)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE International Conference on Real-time Computing and Robotics (RCAR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/RCAR54675.2022.9872259","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Grasping is one of the most common tasks performed by robots, and computer vision can make robotic grasping more intelligent. Previous methods simply treated robotic grasp detection as a problem analogous to object detection, ignoring the characteristics specific to grasping and thereby losing accuracy. Moreover, treating depth images the same way as RGB images is unreasonable. This study proposes a new grasp detection model built on an RGB-D deep fusion module that combines multi-scale RGB and depth features. An adaptive anchor box-setting method based on a two-step approximation is also designed. With a shared network structure for target detection and grasp detection, the target category and an appropriate grasp posture can be obtained end-to-end. Experiments show that, compared with other models, ours achieves a significant improvement in accuracy while maintaining real-time computing performance.
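
The paper's architecture is not reproduced here, but the idea of combining multi-scale RGB and depth features can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch-style example; the module names, channel counts, and the concatenation-plus-1x1-convolution fusion operator are assumptions for illustration, not the authors' implementation. It shows how per-scale RGB and depth feature maps might be fused before being passed to shared detection heads.

import torch
import torch.nn as nn

class RGBDFusionBlock(nn.Module):
    # Fuses RGB and depth feature maps at a single scale by channel
    # concatenation followed by a 1x1 convolution (an assumed design,
    # not necessarily the paper's fusion operator).
    def __init__(self, channels):
        super().__init__()
        self.mix = nn.Conv2d(2 * channels, channels, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, rgb_feat, depth_feat):
        fused = torch.cat([rgb_feat, depth_feat], dim=1)
        return self.relu(self.mix(fused))

class MultiScaleRGBDFusion(nn.Module):
    # Applies one fusion block per feature-pyramid scale.
    def __init__(self, channels_per_scale=(64, 128, 256)):
        super().__init__()
        self.blocks = nn.ModuleList(RGBDFusionBlock(c) for c in channels_per_scale)

    def forward(self, rgb_feats, depth_feats):
        # rgb_feats / depth_feats: lists of tensors, one tensor per scale.
        return [blk(r, d) for blk, r, d in zip(self.blocks, rgb_feats, depth_feats)]

# Example usage with dummy feature maps at three scales.
if __name__ == "__main__":
    fusion = MultiScaleRGBDFusion()
    sizes = [(64, 80), (128, 40), (256, 20)]
    rgb = [torch.randn(1, c, s, s) for c, s in sizes]
    depth = [torch.randn(1, c, s, s) for c, s in sizes]
    fused = fusion(rgb, depth)
    print([f.shape for f in fused])

In a model of the kind the abstract describes, such fused multi-scale features would feed both an object-classification head and a grasp-pose regression head, which is what allows the target category and grasp posture to be predicted end-to-end.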