Simplified autonomous object grasping in material handling process for human–robot collaboration
Muhammad Farouk Setiawan, P. Paryanto, Joga Dharma Setiawan
International Journal of Intelligent Robotics and Applications
DOI: 10.1007/s41315-024-00375-6 | Published: 2024-08-28
Abstract
The application of Human–Robot Collaboration (HRC) in the manufacturing sector, especially in the material handling process, aims to improve productivity by having robots work actively alongside humans. In this setting, robots need an autonomous system that allows them to handle objects by themselves according to user preferences. However, teaching robots to autonomously identify object grasp positions using only an RGB camera remains challenging, because the camera perspective affects how objects appear to the robot. Therefore, this study proposes a simplified method for autonomous object grasping with an RGB camera in the material handling process and implements it within the HRC concept. The method uses a prototype robot manipulator with a computer vision system for object detection. During object grasping, the robot achieved a success rate of 86% for a single object and 76% for multiple objects. In the HRC concept, the robot achieved a success rate of 92% for placing objects one by one and 84% for placing objects continuously. The results also showed fast inference times for real-time object detection, even when running only on the CPU, with a planning process that adds no complexity and requires no equipment beyond an RGB camera.
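The abstract does not detail the detection pipeline, so the following is only a minimal sketch of the general idea of grasp-point estimation from a single RGB frame. It uses OpenCV colour thresholding as a stand-in for the paper's object detector, and the HSV range, camera index, and pixel-to-workspace scaling are illustrative assumptions rather than values from the study.

```python
# Minimal sketch (not the authors' implementation): detect an object in an RGB
# frame and take the bounding-box centre as a candidate grasp point.
import cv2


def find_grasp_point(frame_bgr):
    """Return (u, v) pixel coordinates of the largest detected object, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Assumed HSV range for the target object (e.g. a red workpiece).
    mask = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    return (x + w // 2, y + h // 2)  # grasp at the bounding-box centre


def pixel_to_workspace(u, v, scale=0.001, origin=(0.30, 0.00)):
    """Hypothetical linear pixel-to-metre mapping; a real system would calibrate this."""
    return (origin[0] + v * scale, origin[1] + u * scale)


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # RGB camera only, no depth sensor
    ok, frame = cap.read()
    cap.release()
    if ok:
        point = find_grasp_point(frame)
        if point is not None:
            print("grasp pixel:", point, "-> workspace:", pixel_to_workspace(*point))
```

In practice, the colour threshold would be replaced by whatever detector the system uses, and the pixel-to-workspace mapping would come from a camera calibration that accounts for the perspective effects the paper identifies as the main challenge.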
Journal introduction:
The International Journal of Intelligent Robotics and Applications (IJIRA) fosters the dissemination of new discoveries and novel technologies that advance developments in robotics and their broad applications. This journal provides a publication and communication platform for all robotics topics, from the theoretical fundamentals and technological advances to various applications including manufacturing, space vehicles, biomedical systems and automobiles, data-storage devices, healthcare systems, home appliances, and intelligent highways. IJIRA welcomes contributions from researchers, professionals and industrial practitioners. It publishes original, high-quality and previously unpublished research papers, brief reports, and critical reviews. Specific areas of interest include, but are not limited to: advanced actuators and sensors; collective and social robots; computing, communication and control; design, modeling and prototyping; human and robot interaction; machine learning and intelligence; mobile robots and intelligent autonomous systems; multi-sensor fusion and perception; planning, navigation and localization; robot intelligence, learning and linguistics; robotic vision, recognition and reconstruction; bio-mechatronics and robotics; cloud and swarm robotics; cognitive and neuro robotics; exploration and security robotics; healthcare, medical and assistive robotics; robotics for intelligent manufacturing; service, social and entertainment robotics; space and underwater robots; and novel and emerging applications.