A pixel-level grasp detection method based on Efficient Grasp Aware Network

Robotica · Impact Factor 1.9 · JCR Q3 (Robotics) · CAS Tier 4 (Computer Science) · Pub Date: 2024-09-18 · DOI: 10.1017/s0263574724001358
Haonan Xi, Shaodong Li, Xi Liu
Citations: 0

Abstract

This work proposes a novel grasp detection method, the Efficient Grasp Aware Network (EGA-Net), for robotic visual grasp detection. Our method extracts semantic information for grasping through feature extraction, and the constructed ECA-ResNet module efficiently obtains feature channel weights relevant to the grasping task, which smooths the network's learning. Meanwhile, we use concatenation to retain low-level features with rich spatial information. The method takes an RGB-D image as input and outputs grasp poses together with their quality scores. EGA-Net is trained and tested on the Cornell and Jacquard datasets, achieving 98.9% and 95.8% accuracy, respectively, and it processes an RGB-D image in only 24 ms, enabling real-time performance. Moreover, our method achieved better results than competing approaches in comparison experiments. In real-world grasping experiments, a 6-degree-of-freedom (DOF) UR-5 robotic arm demonstrates robust grasping of unseen objects in various scenes, and the model successfully grasps different types of objects without any prior processing. The experimental results validate the model's exceptional robustness and generalization.
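The paper itself includes no code, but the two mechanisms the abstract names — efficient channel-attention weighting (the ECA part of ECA-ResNet) and a pixel-level grasp output with per-pixel quality scores — can be sketched in plain NumPy. Everything below is an illustrative assumption: the function names, the fixed averaging kernel, and the map layout are not the authors' implementation, only a minimal sketch of the general techniques.

```python
import numpy as np


def eca_channel_weights(feature_map: np.ndarray, k: int = 3) -> np.ndarray:
    """Sketch of ECA-style channel attention (assumed, not the authors' code).

    feature_map: array of shape (C, H, W).
    Returns the channel-reweighted feature map, same shape.
    """
    c, _, _ = feature_map.shape
    # 1. Global average pooling: one scalar descriptor per channel.
    squeeze = feature_map.mean(axis=(1, 2))                 # shape (C,)
    # 2. A 1-D convolution across channels (kernel size k, zero padding)
    #    models local cross-channel interaction without fully connected
    #    layers; a fixed averaging kernel stands in for learned weights.
    pad = k // 2
    padded = np.pad(squeeze, pad)
    kernel = np.ones(k) / k                                  # illustrative only
    conv = np.array([padded[i:i + k] @ kernel for i in range(c)])
    # 3. Sigmoid gate -> per-channel weights in (0, 1).
    weights = 1.0 / (1.0 + np.exp(-conv))
    # 4. Rescale each channel of the input.
    return feature_map * weights[:, None, None]


def decode_grasp(quality: np.ndarray, angle: np.ndarray, width: np.ndarray):
    """Pick the best grasp from dense pixel-wise output maps.

    quality, angle, width: arrays of shape (H, W), the usual output layout
    of pixel-level grasp heads (map names are assumptions).
    Returns (row, col, theta, grasp_width, score).
    """
    idx = np.unravel_index(np.argmax(quality), quality.shape)
    return (int(idx[0]), int(idx[1]),
            float(angle[idx]), float(width[idx]), float(quality[idx]))
```

Because the sigmoid gate produces weights strictly inside (0, 1), the attended features can only shrink each channel, never amplify it, which is the intended "soft selection" of grasp-relevant channels.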
Source Journal
Robotica
Robotica (Engineering & Technology: Robotics)
CiteScore: 4.50
Self-citation rate: 22.20%
Articles per year: 181
Review time: 9.9 months
Journal overview: Robotica is a forum for the multidisciplinary subject of robotics and encourages developments, applications and research in this important field of automation and robotics with regard to industry, health, education and economic and social aspects of relevance. Coverage includes activities in hostile environments, applications in the service and manufacturing industries, biological robotics, dynamics and kinematics involved in robot design and uses, on-line robots, robot task planning, rehabilitation robotics, sensory perception, software in the widest sense, particularly in respect of programming languages and links with CAD/CAM systems, telerobotics and various other areas. In addition, interest is focused on various Artificial Intelligence topics of theoretical and practical interest.
Latest articles in this journal:
3D dynamics and control of a snake robot in uncertain underwater environment
An application of natural matrices to the dynamic balance problem of planar parallel manipulators
Control of stance-leg motion and zero-moment point for achieving perfect upright stationary state of rimless wheel type walker with parallel linkage legs
Trajectory tracking control of a mobile robot using fuzzy logic controller with optimal parameters
High accuracy hybrid kinematic modeling for serial robotic manipulators