GV-iRIOM: GNSS-visual-aided 4D radar inertial odometry and mapping in large-scale environments
Authors: Binliang Wang, Yuan Zhuang, Jianzhu Huai, Yiwen Chen, Jiagang Chen, Nashwa El-Bendary
Journal: ISPRS Journal of Photogrammetry and Remote Sensing, Volume 221, Pages 310-323
DOI: 10.1016/j.isprsjprs.2025.01.039
Publication date: 2025-02-18
URL: https://www.sciencedirect.com/science/article/pii/S0924271625000449
Citations: 0
Abstract
Accurate state estimation is crucial for autonomous navigation in unmanned systems. While traditional visual and lidar systems struggle in adverse conditions such as rain, fog, or smoke, millimeter-wave radar provides robust all-weather localization and mapping capabilities. However, sparse and noisy radar point clouds often compromise localization accuracy and lead to inherent odometry drift. This paper presents GV-iRIOM, a novel millimeter-wave radar localization and mapping system built on a two-layer estimation framework that simultaneously integrates visual, inertial, and GNSS data to improve localization accuracy. The system employs radar inertial odometry and visual inertial odometry as the SLAM front-end. To address the varying observation accuracy of 3-axis motion at different azimuth and elevation angles in 4D radar data, we propose an angle-adaptive weighted robust estimation method for radar ego-velocity estimation. Furthermore, we develop a back-end for multi-source information fusion, integrating odometry pose constraints, GNSS observations, and loop closure constraints to ensure globally consistent positioning and mapping. By dynamically initializing GNSS measurements through observability analysis, the system automatically performs positioning and mapping in an absolute geographic coordinate frame, which facilitates multi-phase map fusion and multi-robot positioning. Experiments conducted on both in-house data and publicly available datasets validate the system’s robustness and effectiveness. In large-scale scenarios, the absolute localization accuracy is improved by more than 50%, ensuring globally consistent mapping across a variety of challenging environments.
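The radar ego-velocity estimation described above exploits the fact that each static radar return constrains the sensor's velocity along the line of sight: a target at unit direction d measures a Doppler (radial) velocity v_r = -d · v. Stacking many returns yields an overdetermined linear system that can be solved by weighted least squares. The sketch below is only illustrative of this general idea; the specific angle-adaptive weighting used in GV-iRIOM is not given in the abstract, so the cosine-of-elevation weighting here (motivated by 4D radars typically resolving azimuth better than elevation) is an assumption, not the paper's actual scheme, and robust (e.g. iteratively reweighted) refinements are omitted.

```python
import numpy as np

def estimate_ego_velocity(points, dopplers):
    """Estimate radar ego-velocity from 4D radar returns.

    points   : (N, 3) target positions in the radar frame (m)
    dopplers : (N,)   measured radial velocities (m/s)
    returns  : (3,)   estimated sensor velocity v such that
               each return approximately satisfies d_i . v = -v_r_i
    """
    # Unit line-of-sight directions to each target.
    dirs = points / np.linalg.norm(points, axis=1, keepdims=True)

    # Angle-adaptive weights (assumed form): down-weight returns at
    # large elevation angles, where the vertical angular accuracy of
    # a 4D radar is usually poorer.
    elevation = np.arcsin(np.clip(dirs[:, 2], -1.0, 1.0))
    w = np.cos(elevation) ** 2

    # Weighted least squares: minimize sum_i w_i (d_i . v + v_r_i)^2,
    # implemented by scaling each row of the system by sqrt(w_i).
    sw = np.sqrt(w)[:, None]
    A = dirs * sw
    b = (-dopplers)[:, None] * sw
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v.ravel()
```

With enough well-spread static returns the system is well conditioned and the estimate feeds the radar inertial odometry front-end as a velocity measurement; in practice outlier returns from moving objects would first be rejected by a robust loss or RANSAC step.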
About the journal:
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) is the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It serves as a platform for scientists and professionals worldwide working across the disciplines that use photogrammetry, remote sensing, spatial information systems, computer vision, and related fields, facilitating communication and dissemination of advances in these areas while also serving as a comprehensive source of reference and archive.
P&RS publishes high-quality, peer-reviewed research papers that should be original and previously unpublished. Papers may cover scientific/research, technological-development, or application/practical aspects. The journal also welcomes papers based on presentations from ISPRS meetings, provided they constitute significant contributions to the fields above.
In particular, P&RS encourages submissions that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, address topics that have received limited attention in P&RS or related journals, or explore new scientific or professional directions. Theoretical papers should preferably include practical applications, while papers on systems and applications should include a theoretical background.