{"title":"基于像素级无碰撞抓取预测网络的医学试管托盘分拣","authors":"Shihao Ge;Beiping Hou;Wen Zhu;Yuzhen Zhu;Senjian Lu;Yangbin Zheng","doi":"10.1109/LRA.2023.3322896","DOIUrl":null,"url":null,"abstract":"Robotic sorting shows a promising aspect for future developments in medical field. However, vision-based grasp detection of medical devices is usually in unstructured or cluttered environments, which raises major challenges for the development of robotic sorting systems. In this letter, a pixel-level grasp detection method is proposed to predict the optimal collision-free grasp configuration on RGB images. First, an Adaptive Grasp Flex Classify (AGFC) model is introduced to add category attributes to distinguish test tube arrangements in complex scenarios. Then, we propose an end-to-end trainable CNN-based architecture, which delivers high quality results for grasp detection and avoids the confusion in neural network learning, to generate the AGFC-model. Utilizing this, we design a Residual Efficient Atrous Spatial Pyramid (REASP) block to further increase the accuracy of grasp detection. Finally, a collision-free manipulation policy is designed to guide the robot to grasp. Experiments on various scenarios are implemented to illustrate the robustness and the effectiveness of our approach, and a robotic grasping platform is constructed to evaluate its application performance. Overall, the developed robotic sorting system achieves a success rate of 95% on test tube sorting.","PeriodicalId":13241,"journal":{"name":"IEEE Robotics and Automation Letters","volume":"8 12","pages":"7897-7904"},"PeriodicalIF":4.6000,"publicationDate":"2023-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Pixel-Level Collision-Free Grasp Prediction Network for Medical Test Tube Sorting on Cluttered Trays\",\"authors\":\"Shihao Ge;Beiping Hou;Wen Zhu;Yuzhen Zhu;Senjian Lu;Yangbin Zheng\",\"doi\":\"10.1109/LRA.2023.3322896\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Robotic sorting shows a promising aspect for future developments in medical field. However, vision-based grasp detection of medical devices is usually in unstructured or cluttered environments, which raises major challenges for the development of robotic sorting systems. In this letter, a pixel-level grasp detection method is proposed to predict the optimal collision-free grasp configuration on RGB images. First, an Adaptive Grasp Flex Classify (AGFC) model is introduced to add category attributes to distinguish test tube arrangements in complex scenarios. Then, we propose an end-to-end trainable CNN-based architecture, which delivers high quality results for grasp detection and avoids the confusion in neural network learning, to generate the AGFC-model. Utilizing this, we design a Residual Efficient Atrous Spatial Pyramid (REASP) block to further increase the accuracy of grasp detection. Finally, a collision-free manipulation policy is designed to guide the robot to grasp. Experiments on various scenarios are implemented to illustrate the robustness and the effectiveness of our approach, and a robotic grasping platform is constructed to evaluate its application performance. 
Overall, the developed robotic sorting system achieves a success rate of 95% on test tube sorting.\",\"PeriodicalId\":13241,\"journal\":{\"name\":\"IEEE Robotics and Automation Letters\",\"volume\":\"8 12\",\"pages\":\"7897-7904\"},\"PeriodicalIF\":4.6000,\"publicationDate\":\"2023-10-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Robotics and Automation Letters\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10274123/\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ROBOTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Robotics and Automation Letters","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10274123/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ROBOTICS","Score":null,"Total":0}
Pixel-Level Collision-Free Grasp Prediction Network for Medical Test Tube Sorting on Cluttered Trays
Robotic sorting shows promise for future developments in the medical field. However, vision-based grasp detection of medical devices usually takes place in unstructured or cluttered environments, which raises major challenges for the development of robotic sorting systems. In this letter, a pixel-level grasp detection method is proposed to predict the optimal collision-free grasp configuration from RGB images. First, an Adaptive Grasp Flex Classify (AGFC) model is introduced to add category attributes that distinguish test tube arrangements in complex scenarios. Then, we propose an end-to-end trainable CNN-based architecture to generate the AGFC model; it delivers high-quality grasp detection results and avoids confusion during neural network learning. Building on this, we design a Residual Efficient Atrous Spatial Pyramid (REASP) block to further increase the accuracy of grasp detection. Finally, a collision-free manipulation policy is designed to guide the robot's grasps. Experiments in various scenarios demonstrate the robustness and effectiveness of our approach, and a robotic grasping platform is constructed to evaluate its application performance. Overall, the developed robotic sorting system achieves a success rate of 95% on test tube sorting.
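The abstract does not give implementation details of the REASP block, so the following is only a minimal sketch of what a residual atrous spatial pyramid module might look like in PyTorch. The class name ResidualAtrousSpatialPyramid, the dilation rates, the channel widths, and the concatenate-then-fuse scheme are all assumptions for illustration, not the authors' published design.

```python
# Hypothetical sketch of a residual atrous spatial pyramid block.
# Layer widths, dilation rates, and the fusion scheme are assumptions;
# they are not taken from the letter.
import torch
import torch.nn as nn


class ResidualAtrousSpatialPyramid(nn.Module):
    """Parallel dilated 3x3 convolutions fused by a 1x1 conv, with a residual shortcut."""

    def __init__(self, in_channels: int, out_channels: int, dilations=(1, 2, 4, 8)):
        super().__init__()
        # One 3x3 branch per dilation rate to capture multi-scale context.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        # 1x1 convolution fuses the concatenated branch outputs back to out_channels.
        self.fuse = nn.Sequential(
            nn.Conv2d(out_channels * len(dilations), out_channels,
                      kernel_size=1, bias=False),
            nn.BatchNorm2d(out_channels),
        )
        # Project the input if channel counts differ so the shortcut can be added.
        self.shortcut = (
            nn.Identity() if in_channels == out_channels
            else nn.Conv2d(in_channels, out_channels, kernel_size=1, bias=False)
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        multi_scale = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.relu(self.fuse(multi_scale) + self.shortcut(x))


if __name__ == "__main__":
    # Example feature map from an RGB backbone (64 channels at reduced resolution).
    features = torch.randn(1, 64, 120, 160)
    block = ResidualAtrousSpatialPyramid(in_channels=64, out_channels=64)
    print(block(features).shape)  # torch.Size([1, 64, 120, 160])
```

The residual shortcut keeps gradients flowing through the block while the dilated branches enlarge the receptive field without reducing spatial resolution, which is the general motivation for atrous spatial pyramid designs in dense, pixel-level prediction tasks such as grasp detection.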
Journal description:
The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.