{"title":"An Efficient Approach for Calibration of Automotive Radar–Camera With Real-Time Projection of Multimodal Data","authors":"Nitish Kumar;Ayush Dasgupta;Venkata Satyanand Mutnuri;Rajalakshmi Pachamuthu","doi":"10.1109/TRS.2024.3408231","DOIUrl":null,"url":null,"abstract":"This article presents a comprehensive method for radar-camera calibration with a primary focus on real-time projection, addressing the critical need for precise spatial and temporal alignment between radar and camera sensor modalities. The research introduces a novel methodology for calibration utilizing geometrical transformation, incorporating radar corner reflectors to establish correspondences. This methodology applies to post-automotive manufacturing for integration into radar-camera applications such as advanced driver-assistance systems (ADASs), adaptive cruise control (ACC), collision warning, and mitigation systems. It also serves post-production for sensor installation and algorithm development. The proposed approach employs an advanced algorithm to optimize spatial and temporal synchronization and radar and camera data alignment, ensuring accuracy in multimodal sensor fusion. Rigorous validation through extensive testing demonstrates the efficiency and reliability of the proposed system. The results show that the calibration method is highly accurate compared to the existing state-of-the-art methods, with minimal errors, an average Euclidean distance (AED) of 1.447, and a root-mean-square reprojection error (RMSRE) of (0.1720, 0.5965), indicating a highly efficient spatial synchronization method. During real-time projection, the proposed algorithm for temporal synchronization achieves an average latency of 35 ms between frames.","PeriodicalId":100645,"journal":{"name":"IEEE Transactions on Radar Systems","volume":"2 ","pages":"573-582"},"PeriodicalIF":0.0000,"publicationDate":"2024-06-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Radar Systems","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10546272/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
This article presents a comprehensive method for radar-camera calibration with a primary focus on real-time projection, addressing the critical need for precise spatial and temporal alignment between the radar and camera sensor modalities. The research introduces a novel calibration methodology based on geometrical transformation, using radar corner reflectors to establish correspondences. The methodology can be applied after automotive manufacturing to integrate radar-camera systems into applications such as advanced driver-assistance systems (ADASs), adaptive cruise control (ACC), and collision warning and mitigation systems; it also supports post-production sensor installation and algorithm development. The proposed approach employs an advanced algorithm to optimize spatial and temporal synchronization and the alignment of radar and camera data, ensuring accuracy in multimodal sensor fusion. Rigorous validation through extensive testing demonstrates the efficiency and reliability of the proposed system. The results show that the calibration method is highly accurate compared with existing state-of-the-art methods, with minimal errors: an average Euclidean distance (AED) of 1.447 and a root-mean-square reprojection error (RMSRE) of (0.1720, 0.5965), indicating a highly efficient spatial synchronization method. During real-time projection, the proposed algorithm for temporal synchronization achieves an average latency of 35 ms between frames.
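Since the abstract only names the calibration pipeline and its error metrics, a minimal sketch may help make those quantities concrete. The snippet below is not the authors' implementation: the extrinsics R and t, the intrinsic matrix K, and the example points are hypothetical. It projects radar detections into the image plane via a rigid transform and a pinhole model, then computes the average Euclidean distance (AED) and a per-axis root-mean-square reprojection error (RMSRE) against annotated reflector pixel locations, mirroring the metrics reported above.

```python
# Illustrative sketch (not the paper's code): project radar points into the
# camera image with assumed extrinsics/intrinsics and score the alignment
# with AED and per-axis RMS reprojection error.
import numpy as np

def project_radar_to_image(radar_xyz, R, t, K):
    """Map Nx3 radar points (metres, assumed camera-like axes: z forward)
    to Nx2 pixel coordinates.

    R (3x3) and t (3,) are assumed radar-to-camera extrinsics; K (3x3) is
    the camera intrinsic matrix. A real pipeline would estimate R and t
    from corner-reflector correspondences (e.g. a least-squares / PnP fit).
    """
    cam = radar_xyz @ R.T + t            # radar frame -> camera frame
    uvw = cam @ K.T                      # pinhole projection (no distortion)
    return uvw[:, :2] / uvw[:, 2:3]      # perspective divide -> pixels

def alignment_errors(projected_px, annotated_px):
    """AED and per-axis RMS reprojection error between projected radar
    points and annotated reflector centres in the image."""
    diff = projected_px - annotated_px
    aed = np.mean(np.linalg.norm(diff, axis=1))
    rmsre = np.sqrt(np.mean(diff ** 2, axis=0))   # (RMSE_u, RMSE_v)
    return aed, rmsre

if __name__ == "__main__":
    # Hypothetical numbers purely for illustration.
    K = np.array([[1200.0,    0.0, 960.0],
                  [   0.0, 1200.0, 540.0],
                  [   0.0,    0.0,   1.0]])
    R = np.eye(3)                          # assume axes already aligned
    t = np.array([0.1, -0.2, 0.0])         # radar mounted slightly off-centre
    radar_pts = np.array([[ 1.0, 0.5, 10.0],
                          [-0.8, 0.3, 15.0]])
    px = project_radar_to_image(radar_pts, R, t, K)
    gt = px + np.random.normal(scale=0.5, size=px.shape)  # stand-in annotations
    aed, rmsre = alignment_errors(px, gt)
    print(f"AED = {aed:.3f} px, RMSRE = ({rmsre[0]:.4f}, {rmsre[1]:.4f}) px")
```

In practice the annotated pixel locations would come from detecting the corner reflectors in the camera frames, and the same error functions could then be used to compare any candidate calibration against ground truth.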