Automatic Robot Hand-Eye Calibration Enabled by Learning-Based 3D Vision

IF 3.1 | CAS Quartile 4 (Computer Science) | JCR Q2 (Computer Science, Artificial Intelligence) | Journal of Intelligent & Robotic Systems | Pub Date: 2024-09-05 | DOI: 10.1007/s10846-024-02166-4
Leihui Li, Xingyu Yang, Riwei Wang, Xuping Zhang
Citations: 0

Abstract

Automatic Robot Hand-Eye Calibration Enabled by Learning-Based 3D Vision

Hand-eye calibration, a fundamental task in vision-based robotic systems, is commonly required for collaborative robots, especially in robotic applications at small and medium-sized enterprises (SMEs). Most approaches to hand-eye calibration rely on external markers or human assistance. We propose a novel methodology that addresses the hand-eye calibration problem using the robot base as a reference, eliminating the need for external calibration objects or human intervention. Using point clouds of the robot base, a transformation matrix from the camera coordinate frame to the robot base is established as “I = AXB.” To this end, we exploit learning-based 3D detection and registration algorithms to estimate the location and orientation of the robot base. The robustness and accuracy of the method are quantified through ground-truth-based evaluation, and the accuracy is compared with that of other 3D-vision-based calibration methods. To assess the feasibility of our methodology, we carried out experiments with a low-cost structured-light scanner across varying joint configurations and multiple experiment groups. According to the experimental results, the proposed hand-eye calibration method achieved a translation deviation of 0.930 mm and a rotation deviation of 0.265 degrees. Additionally, the 3D reconstruction experiments demonstrated a rotation error of 0.994 degrees and a position error of 1.697 mm. Moreover, our method has the potential to be completed in 1 second, the fastest among the compared 3D hand-eye calibration methods. We also conducted indoor 3D reconstruction and robotic grasping experiments based on our hand-eye calibration method. Related code is released at https://github.com/leihui6/LRBO.
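The abstract formulates the calibration as the matrix equation I = AXB, where X is the unknown camera-to-base transform. The precise roles of A and B are defined in the paper itself; as a minimal sketch of the underlying matrix algebra only (with hypothetical helper names, assuming A, B, and X are 4x4 homogeneous rigid transforms), the closed-form solution follows from rearranging I = A @ X @ B into X = A⁻¹ @ B⁻¹:

```python
import numpy as np

def rot_z(theta):
    """3x3 rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def solve_hand_eye(A, B):
    """Solve I = A @ X @ B for X: left-multiply by A^-1, right-multiply by B^-1."""
    return np.linalg.inv(A) @ np.linalg.inv(B)

# Hypothetical example: A and B as arbitrary rigid transforms.
A = make_T(rot_z(0.3), [0.1, -0.2, 0.5])
B = make_T(rot_z(-1.1), [0.0, 0.4, 0.2])
X = solve_hand_eye(A, B)
# Verify the defining identity: A @ X @ B should recover the identity matrix.
print(np.allclose(A @ X @ B, np.eye(4)))
```

In practice the paper obtains the inputs from learning-based detection and registration of the robot-base point cloud rather than from known transforms; this sketch only illustrates why a single observation of the base suffices to recover X in closed form.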

Source Journal
Journal of Intelligent & Robotic Systems (Engineering/Technology – Robotics)
CiteScore: 7.00
Self-citation rate: 9.10%
Articles per year: 219
Review time: 6 months
Journal Description: The Journal of Intelligent and Robotic Systems bridges the gap between theory and practice in all areas of intelligent systems and robotics. It publishes original, peer-reviewed contributions spanning initial concept and theory, prototyping, and final product development and commercialization. On the theoretical side, the journal features papers on intelligent systems engineering, distributed intelligence systems, multi-level systems, intelligent control, multi-robot systems, and cooperation and coordination of unmanned vehicle systems. On the application side, it emphasizes autonomous systems, industrial robotic systems, multi-robot systems, aerial vehicles, mobile robot platforms, underwater robots, sensors, sensor fusion, and sensor-based control. Readers will also find papers on real applications of intelligent and robotic systems (e.g., mechatronics, manufacturing, biomedical, underwater, humanoid, mobile/legged robot, and space applications).