{"title":"基于传感器语义信息的相机-激光雷达在线标定","authors":"Yufeng Zhu, Chenghui Li, Yubo Zhang","doi":"10.1109/ICRA40945.2020.9196627","DOIUrl":null,"url":null,"abstract":"As a crucial step of sensor data fusion, sensor calibration plays a vital role in many cutting-edge machine vision applications, such as autonomous vehicles and AR/VR. Existing techniques either require quite amount of manual work and complex settings, or are unrobust and prone to produce suboptimal results. In this paper, we investigate the extrinsic calibration of an RGB camera and a light detection and ranging (LiDAR) sensor, which are two of the most widely used sensors in autonomous vehicles for perceiving the outdoor environment. Specifically, we introduce an online calibration technique that automatically computes the optimal rigid motion transformation between the aforementioned two sensors and maximizes their mutual information of perceived data, without the need of tuning environment settings. By formulating the calibration as an optimization problem with a novel calibration quality metric based on semantic features, we successfully and robustly align pairs of temporally synchronized camera and LiDAR frames in real time. Demonstrated on several autonomous driving tasks, our method outperforms state-of-the-art edge feature based auto-calibration approaches in terms of robustness and accuracy.","PeriodicalId":6859,"journal":{"name":"2020 IEEE International Conference on Robotics and Automation (ICRA)","volume":"13 1","pages":"4970-4976"},"PeriodicalIF":0.0000,"publicationDate":"2020-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"42","resultStr":"{\"title\":\"Online Camera-LiDAR Calibration with Sensor Semantic Information\",\"authors\":\"Yufeng Zhu, Chenghui Li, Yubo Zhang\",\"doi\":\"10.1109/ICRA40945.2020.9196627\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As a crucial step of sensor data fusion, sensor calibration plays a vital role in many cutting-edge machine vision applications, such as autonomous vehicles and AR/VR. Existing techniques either require quite amount of manual work and complex settings, or are unrobust and prone to produce suboptimal results. In this paper, we investigate the extrinsic calibration of an RGB camera and a light detection and ranging (LiDAR) sensor, which are two of the most widely used sensors in autonomous vehicles for perceiving the outdoor environment. Specifically, we introduce an online calibration technique that automatically computes the optimal rigid motion transformation between the aforementioned two sensors and maximizes their mutual information of perceived data, without the need of tuning environment settings. By formulating the calibration as an optimization problem with a novel calibration quality metric based on semantic features, we successfully and robustly align pairs of temporally synchronized camera and LiDAR frames in real time. 
Demonstrated on several autonomous driving tasks, our method outperforms state-of-the-art edge feature based auto-calibration approaches in terms of robustness and accuracy.\",\"PeriodicalId\":6859,\"journal\":{\"name\":\"2020 IEEE International Conference on Robotics and Automation (ICRA)\",\"volume\":\"13 1\",\"pages\":\"4970-4976\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"42\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE International Conference on Robotics and Automation (ICRA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICRA40945.2020.9196627\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Conference on Robotics and Automation (ICRA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICRA40945.2020.9196627","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Online Camera-LiDAR Calibration with Sensor Semantic Information
Abstract: As a crucial step in sensor data fusion, sensor calibration plays a vital role in many cutting-edge machine vision applications, such as autonomous vehicles and AR/VR. Existing techniques either require a substantial amount of manual work and complex environment setups, or are not robust and prone to producing suboptimal results. In this paper, we investigate the extrinsic calibration of an RGB camera and a light detection and ranging (LiDAR) sensor, two of the most widely used sensors on autonomous vehicles for perceiving the outdoor environment. Specifically, we introduce an online calibration technique that automatically computes the optimal rigid transformation between the two sensors by maximizing the mutual information of their perceived data, without requiring any special environment setup. By formulating calibration as an optimization problem with a novel calibration quality metric based on semantic features, we robustly align pairs of temporally synchronized camera and LiDAR frames in real time. Demonstrated on several autonomous driving tasks, our method outperforms state-of-the-art edge-feature-based auto-calibration approaches in both robustness and accuracy.
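The abstract gives enough structure to sketch the core idea: project semantically labeled LiDAR points into a semantically segmented camera image under a candidate extrinsic, and score how well the labels agree. The Python sketch below is a minimal illustration under stated assumptions, not the paper's implementation: the function names are hypothetical, the simple label-agreement score stands in for the paper's mutual-information-based metric, and the search perturbs only one rotation axis where a full calibrator would optimize all six degrees of freedom.

```python
import numpy as np

def project_lidar_to_image(points, R, t, K):
    """Transform (N, 3) LiDAR points into the camera frame with a candidate
    extrinsic (R, t), then project them through the intrinsic matrix K."""
    pts_cam = points @ R.T + t                 # LiDAR frame -> camera frame
    in_front = pts_cam[:, 2] > 0.1             # discard points behind the camera
    uvw = pts_cam[in_front] @ K.T              # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]              # perspective divide -> pixel coords
    return uv, in_front

def semantic_alignment_score(points, point_labels, seg_mask, R, t, K):
    """Fraction of projected LiDAR points whose semantic class agrees with the
    camera segmentation label at the pixel they fall on. A simplified stand-in
    for the paper's semantic calibration quality metric (assumption)."""
    uv, in_front = project_lidar_to_image(points, R, t, K)
    labels = point_labels[in_front]
    h, w = seg_mask.shape
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    valid = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    if not valid.any():
        return 0.0
    return float(np.mean(seg_mask[v[valid], u[valid]] == labels[valid]))

def refine_one_axis(points, point_labels, seg_mask, R0, t0, K,
                    span_deg=2.0, steps=41):
    """Coarse online refinement: perturb the current extrinsic about one axis
    and keep the candidate that scores best on the latest frame pair."""
    best_R, best_score = R0, -1.0
    for d in np.linspace(-span_deg, span_deg, steps):
        c, s = np.cos(np.radians(d)), np.sin(np.radians(d))
        dR = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
        score = semantic_alignment_score(points, point_labels, seg_mask,
                                         dR @ R0, t0, K)
        if score > best_score:
            best_R, best_score = dR @ R0, score
    return best_R, best_score
```

As the abstract describes, the actual method maximizes the mutual information between the two sensors' semantic observations rather than a raw match rate; the per-frame score above merely shows where a candidate extrinsic enters the objective and why a small misalignment (points spilling across object boundaries onto the wrong class) lowers it.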