Simplified autonomous object grasping in material handling process for human–robot collaboration

Muhammad Farouk Setiawan, P. Paryanto, Joga Dharma Setiawan
{"title":"Simplified autonomous object grasping in material handling process for human–robot collaboration","authors":"Muhammad Farouk Setiawan, P. Paryanto, Joga Dharma Setiawan","doi":"10.1007/s41315-024-00375-6","DOIUrl":null,"url":null,"abstract":"<p>The application of Human–Robot Collaboration (HRC) in the manufacturing sector, especially in the material handling process, is aimed at improving productivity through robots actively working alongside humans. In this condition, the robots need to understand how to handle the objects by themselves according to user preferences with an autonomous system. However, there have been challenges in the aspect of teaching robots to autonomously identify object grasp positions only using an RGB camera due to the effect of camera perspective on object visualization for robots. Therefore, this study aimed to propose a simplified method on an RGB camera for autonomous object grasping in the material handling process and implement it for the HRC concept. The method used a prototype robot manipulator with a computer vision system for object detection. During the execution of object grasping, the robot achieved a success rate of 86% for a single object and 76% for multiple objects. In the HRC concept, the robot achieved a success rate of 92% for placing objects one by one and 84% for placing objects continuously. The result also showed fast inference time when the robot in real-time detected the object, which was even just running on the CPU and in the planning process without complexity and requiring additional equipment aside from an RGB camera.</p>","PeriodicalId":44563,"journal":{"name":"International Journal of Intelligent Robotics and Applications","volume":"15 1","pages":""},"PeriodicalIF":2.1000,"publicationDate":"2024-08-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Intelligent Robotics and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s41315-024-00375-6","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ROBOTICS","Score":null,"Total":0}

Abstract

The application of Human–Robot Collaboration (HRC) in the manufacturing sector, especially in the material handling process, aims to improve productivity by having robots work actively alongside humans. In this setting, robots must autonomously determine how to handle objects according to user preferences. However, teaching a robot to autonomously identify object grasp positions using only an RGB camera remains challenging, because the camera perspective affects how objects appear to the robot. This study therefore proposes a simplified RGB-camera-based method for autonomous object grasping in the material handling process and implements it within the HRC concept. The method uses a prototype robot manipulator with a computer vision system for object detection. During object grasping, the robot achieved a success rate of 86% for a single object and 76% for multiple objects. In the HRC scenario, the robot achieved a success rate of 92% when placing objects one by one and 84% when placing objects continuously. The results also show fast inference times for real-time object detection, even when running only on the CPU, with a planning process that adds no complexity and requires no equipment other than an RGB camera.
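
The abstract describes the pipeline only at a high level: object detection from a single RGB camera, CPU-only inference, and mapping the detected grasp position into the robot's workspace. The paper's actual detector and calibration procedure are not given here, so the following is a minimal sketch under assumed simplifications: a classical OpenCV threshold-and-contour detector and a hypothetical four-point tabletop homography. The names `PIXEL_PTS`, `ROBOT_PTS`, and `estimate_grasp`, and all thresholds, are illustrative and not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): estimate a grasp point and
# orientation for one object from a single RGB frame using classical OpenCV
# steps, then map the pixel location to robot coordinates with a homography
# from a prior tabletop calibration. All values below are illustrative.

import cv2
import numpy as np

# Hypothetical calibration: four pixel points and the matching robot XY points
# (e.g. measured once by jogging the robot to markers on the work surface).
PIXEL_PTS = np.array([[100, 100], [540, 100], [540, 380], [100, 380]], dtype=np.float32)
ROBOT_PTS = np.array([[0.20, -0.15], [0.20, 0.15], [0.45, 0.15], [0.45, -0.15]], dtype=np.float32)
H, _ = cv2.findHomography(PIXEL_PTS, ROBOT_PTS)

def estimate_grasp(frame_bgr):
    """Return (robot_x, robot_y, grasp_angle_deg) for the largest detected object, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding assumes the object contrasts with the work surface.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    if cv2.contourArea(largest) < 500:        # ignore small noise blobs
        return None
    (cx, cy), (w, h), angle = cv2.minAreaRect(largest)
    # Map the pixel centre into robot-frame coordinates via the homography.
    px = np.array([[[cx, cy]]], dtype=np.float32)
    rx, ry = cv2.perspectiveTransform(px, H)[0, 0]
    # Grasp across the shorter side of the bounding rectangle
    # (angle convention follows cv2.minAreaRect).
    grasp_angle = angle if w >= h else angle + 90.0
    return float(rx), float(ry), float(grasp_angle)

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)                 # RGB camera only, CPU-only processing
    ok, frame = cap.read()
    cap.release()
    if ok:
        print(estimate_grasp(frame))
```

Running such a per-frame pipeline on a CPU is consistent with the abstract's claim of fast inference without extra hardware, although the published method may use a learned detector rather than simple thresholding.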

Source journal: International Journal of Intelligent Robotics and Applications
CiteScore: 3.80 | Self-citation rate: 5.90% | Publication volume: 50
Journal description: The International Journal of Intelligent Robotics and Applications (IJIRA) fosters the dissemination of new discoveries and novel technologies that advance developments in robotics and their broad applications. This journal provides a publication and communication platform for all robotics topics, from the theoretical fundamentals and technological advances to various applications including manufacturing, space vehicles, biomedical systems and automobiles, data-storage devices, healthcare systems, home appliances, and intelligent highways. IJIRA welcomes contributions from researchers, professionals and industrial practitioners. It publishes original, high-quality and previously unpublished research papers, brief reports, and critical reviews. Specific areas of interest include, but are not limited to:
- Advanced actuators and sensors
- Collective and social robots
- Computing, communication and control
- Design, modeling and prototyping
- Human and robot interaction
- Machine learning and intelligence
- Mobile robots and intelligent autonomous systems
- Multi-sensor fusion and perception
- Planning, navigation and localization
- Robot intelligence, learning and linguistics
- Robotic vision, recognition and reconstruction
- Bio-mechatronics and robotics
- Cloud and Swarm robotics
- Cognitive and neuro robotics
- Exploration and security robotics
- Healthcare, medical and assistive robotics
- Robotics for intelligent manufacturing
- Service, social and entertainment robotics
- Space and underwater robots
- Novel and emerging applications