Vision-guided robot calibration using photogrammetric methods
Markus Ulrich, Carsten Steger, Florian Butsch, Maurice Liebe
ISPRS Journal of Photogrammetry and Remote Sensing, Volume 218, Pages 645–662
Published: 2024-10-04 | DOI: 10.1016/j.isprsjprs.2024.09.037
https://www.sciencedirect.com/science/article/pii/S0924271624003757
Citations: 0
Abstract
We propose novel photogrammetry-based robot calibration methods for industrial robots that are guided by cameras or 3D sensors. Compared to state-of-the-art methods, our methods are capable of calibrating the robot kinematics, the hand–eye transformations, and, for camera-guided robots, the interior orientation of the camera simultaneously. Our approach uses a minimal parameterization of the robot kinematics and hand–eye transformations. Furthermore, it uses a camera model that is capable of handling a large range of complex lens distortions that can occur in cameras that are typically used in machine vision applications. To determine the model parameters, geometrically meaningful photogrammetric error measures are used. They are independent of the parameterization of the model and typically result in a higher accuracy. We apply a stochastic model for all parameters (observations and unknowns), which allows us to assess the precision and significance of the calibrated model parameters. To evaluate our methods, we propose novel procedures that are relevant in real-world applications and do not require ground truth values. Experiments on synthetic and real data show that our approach improves the absolute positioning accuracy of industrial robots significantly. By applying our approach to two different uncalibrated UR3e robots, one guided by a camera and one by a 3D sensor, we were able to reduce the RMS evaluation error by approximately 85% for each robot.
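To make the general idea more concrete, the following is a minimal, self-contained sketch (not the authors' algorithm) of jointly estimating a hand-eye transform and a camera's focal length by minimizing a geometric, image-space reprojection error, which is the spirit of the "geometrically meaningful error measures" mentioned above. It assumes a simplified pinhole camera without lens distortion, a calibration target with coordinates known in the robot base frame, and flange poses reported by the robot controller; the function names (`project`, `residuals`) and the use of `scipy.optimize.least_squares` are illustrative choices, not part of the paper, and robot-kinematic parameters are omitted for brevity.

```python
# Minimal sketch: jointly estimating a hand-eye transform and a focal length
# by minimizing the reprojection error (a geometric, image-space error measure).
# This illustrates the general principle only, not the method of Ulrich et al.;
# it omits robot-kinematic parameters and lens distortion.
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation as R


def project(points_cam, f, cx, cy):
    """Project 3D points given in the camera frame with a simple pinhole model."""
    x = points_cam[:, 0] / points_cam[:, 2]
    y = points_cam[:, 1] / points_cam[:, 2]
    return np.column_stack([f * x + cx, f * y + cy])


def residuals(params, flange_poses, target_points, observations, cx, cy):
    """Stacked reprojection residuals over all robot poses.

    params       : [hand-eye rotation vector (3), hand-eye translation (3), focal length].
    flange_poses : list of (R, t) base->flange poses from the robot controller.
    target_points: (N, 3) target coordinates known in the robot base frame.
    observations : list of (N, 2) measured image points, one array per pose.
    """
    he_rot = R.from_rotvec(params[0:3]).as_matrix()  # flange -> camera rotation
    he_t = params[3:6]                               # flange -> camera translation
    f = params[6]
    res = []
    for (base_R_fl, base_t_fl), obs in zip(flange_poses, observations):
        # Chain the forward kinematics with the hand-eye transform: base -> camera.
        base_R_cam = base_R_fl @ he_rot
        base_t_cam = base_R_fl @ he_t + base_t_fl
        # Express the target points in the camera frame and project them.
        pts_cam = (target_points - base_t_cam) @ base_R_cam
        res.append(project(pts_cam, f, cx, cy) - obs)
    return np.concatenate(res).ravel()


if __name__ == "__main__":
    # Synthetic self-test: generate consistent data and recover the parameters.
    rng = np.random.default_rng(0)
    cx, cy = 320.0, 240.0
    f_true = 800.0
    he_R_true = R.from_rotvec([0.02, -0.01, 0.03]).as_matrix()
    he_t_true = np.array([0.05, 0.00, 0.10])
    target_points = rng.uniform(-0.15, 0.15, size=(20, 3)) + np.array([0.0, 0.0, 0.6])

    flange_poses, observations = [], []
    for _ in range(8):
        base_R_fl = R.from_rotvec(rng.uniform(-0.2, 0.2, 3)).as_matrix()
        base_t_fl = rng.uniform(-0.05, 0.05, 3)
        base_R_cam = base_R_fl @ he_R_true
        base_t_cam = base_R_fl @ he_t_true + base_t_fl
        pts_cam = (target_points - base_t_cam) @ base_R_cam
        observations.append(project(pts_cam, f_true, cx, cy))
        flange_poses.append((base_R_fl, base_t_fl))

    x0 = np.concatenate([np.zeros(3), np.zeros(3), [700.0]])  # rough initial guess
    sol = least_squares(residuals, x0,
                        args=(flange_poses, target_points, observations, cx, cy))
    print("estimated hand-eye translation:", sol.x[3:6])
    print("estimated focal length:", sol.x[6])
```

Minimizing residuals in image space, rather than differences between transformation parameters, keeps the error measure independent of the chosen parameterization; the paper extends this idea to the full robot kinematics, a richer distortion model, and a stochastic model of all observations and unknowns.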
About the journal:
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) is the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It serves as a platform for scientists and professionals worldwide working in the disciplines that use photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate the communication and dissemination of advances in these disciplines, while also serving as a comprehensive source of reference and an archive.
P&RS publishes high-quality, peer-reviewed research papers, preferably original work that has not been published before. Papers may address scientific research, technological development, or practical applications. The journal also welcomes papers based on presentations at ISPRS meetings, provided they constitute significant contributions to the aforementioned fields.
In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.