MOVRO2: Loosely coupled monocular visual radar odometry using factor graph optimization

Vlaho-Josip Štironja, Juraj Peršić, Luka Petrović, Ivan Marković, Ivan Petrović

Robotics and Autonomous Systems, Volume 184, Article 104860, published 25 November 2024. DOI: 10.1016/j.robot.2024.104860
https://www.sciencedirect.com/science/article/pii/S0921889024002446
Abstract
Ego-motion estimation is an indispensable part of any autonomous system, especially in scenarios where wheel odometry or global pose measurements are unreliable or unavailable. In environments where a global navigation satellite system is unavailable, conventional solutions for ego-motion estimation rely on fusing a LiDAR, a monocular camera and an inertial measurement unit (IMU), a combination that is often plagued by drift. Complementary sensor configurations are therefore being explored as alternatives to relying on expensive, high-grade IMUs. In this paper, we propose MOVRO2, an ego-motion estimation method that exploits the complementarity of radar and camera data. It is based on a loosely coupled monocular visual radar odometry approach within a factor graph optimization framework. The loosely coupled approach is motivated by its scalability and by the possibility of developing the sensor models independently. To estimate motion within the proposed framework, we fuse the radar's ego-velocity and scan-to-scan matches with the rotation obtained from consecutive camera frames and the unscaled velocity of the monocular odometry. We evaluate the proposed method on two open-source datasets and compare it to various single-, dual- and triple-sensor solutions; our cost-effective method demonstrates performance comparable to state-of-the-art visual-inertial-radar odometry and LiDAR odometry solutions that use high-performance 64-line LiDARs.
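To make the fusion scheme concrete, the following is a minimal sketch of how such a loosely coupled factor graph could be assembled with GTSAM's Python bindings. It is an illustration under stated assumptions, not the authors' implementation: the inputs `cam_rotations`, `radar_velocities` and `scan_matches` are hypothetical outputs of independent sensor front-ends, the noise magnitudes are placeholders, and the paper's unscaled monocular-velocity factor and any robust kernels are omitted for brevity.

```python
# Illustrative sketch of a loosely coupled visual-radar odometry factor
# graph (not the authors' implementation), using GTSAM's Python bindings.
import numpy as np
import gtsam

def build_and_solve(cam_rotations, radar_velocities, scan_matches, dt):
    """cam_rotations:    list of gtsam.Rot3, frame-to-frame rotation from the camera
    radar_velocities: list of 3-vectors, radar ego-velocity in the body frame
    scan_matches:     list of gtsam.Pose3, radar scan-to-scan relative poses
    dt:               time step between consecutive frames (hypothetical, fixed)
    """
    graph = gtsam.NonlinearFactorGraph()
    initial = gtsam.Values()

    # Anchor the first pose; odometry is estimated relative to it.
    prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-6))
    graph.add(gtsam.PriorFactorPose3(0, gtsam.Pose3(), prior_noise))
    initial.insert(0, gtsam.Pose3())

    # Placeholder noise magnitudes (rotation in rad, translation in m);
    # a real system would tune these per sensor.
    odom_noise = gtsam.noiseModel.Diagonal.Sigmas(
        np.array([0.01, 0.01, 0.01, 0.05, 0.05, 0.05]))
    match_noise = gtsam.noiseModel.Diagonal.Sigmas(
        np.array([0.02, 0.02, 0.02, 0.10, 0.10, 0.10]))

    pose = gtsam.Pose3()
    for k, (R_cam, v_radar, match) in enumerate(
            zip(cam_rotations, radar_velocities, scan_matches)):
        # Loose coupling: the camera supplies the relative rotation, and the
        # radar ego-velocity integrated over dt supplies the metric translation.
        delta = gtsam.Pose3(R_cam, gtsam.Point3(*(np.asarray(v_radar) * dt)))
        graph.add(gtsam.BetweenFactorPose3(k, k + 1, delta, odom_noise))

        # Independent radar scan-to-scan match as a second relative-pose factor.
        graph.add(gtsam.BetweenFactorPose3(k, k + 1, match, match_noise))

        pose = pose.compose(delta)  # dead-reckoned initial guess
        initial.insert(k + 1, pose)

    return gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
```

Keeping each sensor behind its own relative-motion factor is what makes the loose coupling scalable: a new sensor model only needs to emit its own constraints, and the graph absorbs them without changes to the other models.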
About the Journal
Robotics and Autonomous Systems carries articles describing fundamental developments in the field of robotics, with special emphasis on autonomous systems. An important goal of the journal is to extend the state of the art in both symbolic and sensory-based robot control and learning in the context of autonomous systems. It also carries articles on the theoretical, computational and experimental aspects of autonomous systems, or modules of such systems.