{"title":"An EKF Based Approach to Radar Inertial Odometry","authors":"C. Doer, G. Trommer","doi":"10.1109/MFI49285.2020.9235254","DOIUrl":null,"url":null,"abstract":"Accurate localization is key for autonomous robotics. Navigation in GNSS-denied and degraded visual environment is still very challenging. Approaches based on visual sensors usually fail in conditions like darkness, direct sunlight, fog or smoke.Our approach is based on a millimeter wave FMCW radar sensor and an Inertial Measurement Unit (IMU) as both sensors can operate in these conditions. Specifically, we propose an Extended Kalman Filter (EKF) based solution to 3D Radar Inertial Odometry (RIO). A standard automotive FMCW radar which measures the 3D position and Doppler velocity of each detected target is used. Based on the radar measurements, a RANSAC 3D ego velocity estimation is carried out. Fusion with inertial data further improves the accuracy, robustness and provides a high rate motion estimate. An extension with barometric height fusion is presented.The radar based ego velocity estimation is tested in simulation and the accuracy evaluated with real world datasets in a motion capture system. Tests in indoor and outdoor environments with trajectories longer than 200m achieved a final position error below 0.6% of the distance traveled. 
The proposed odometry approach runs faster than realtime even on an embedded computer.","PeriodicalId":446154,"journal":{"name":"2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"23","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MFI49285.2020.9235254","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 23
Abstract
Accurate localization is key for autonomous robotics. Navigation in GNSS-denied and visually degraded environments is still very challenging, and approaches based on visual sensors usually fail in conditions such as darkness, direct sunlight, fog, or smoke. Our approach is based on a millimeter-wave FMCW radar sensor and an Inertial Measurement Unit (IMU), as both sensors can operate in these conditions. Specifically, we propose an Extended Kalman Filter (EKF) based solution to 3D Radar Inertial Odometry (RIO). A standard automotive FMCW radar is used, which measures the 3D position and Doppler velocity of each detected target. Based on the radar measurements, a RANSAC-based 3D ego-velocity estimation is carried out. Fusion with inertial data further improves accuracy and robustness and provides a high-rate motion estimate. An extension with barometric height fusion is presented. The radar-based ego-velocity estimation is tested in simulation, and its accuracy is evaluated on real-world datasets recorded in a motion capture system. Tests in indoor and outdoor environments on trajectories longer than 200 m achieved a final position error below 0.6% of the distance traveled. The proposed odometry approach runs faster than real time even on an embedded computer.
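To illustrate the RANSAC ego-velocity step described in the abstract: each static radar target at unit direction r_i produces a Doppler measurement d_i ≈ −r_i · v, where v is the sensor's 3D ego velocity, so stacking targets yields a linear least-squares problem that RANSAC can robustify against moving targets. The sketch below is an illustrative reconstruction under those assumptions, not the paper's implementation; the threshold and iteration count are arbitrary placeholders.

```python
import numpy as np

def ransac_ego_velocity(directions, dopplers, iters=100, thresh=0.1, seed=0):
    """Estimate the 3D sensor ego-velocity from radar target measurements.

    Assumes each static target at unit direction r_i yields a Doppler
    reading d_i = -r_i . v, giving the linear system A v = -d over all
    targets. RANSAC rejects moving targets (outliers) before a final
    least-squares refit. Illustrative sketch only; parameters are not
    taken from the paper.
    """
    rng = np.random.default_rng(seed)
    A = np.asarray(directions, dtype=float)   # (N, 3) unit direction vectors
    d = np.asarray(dopplers, dtype=float)     # (N,) radial Doppler velocities
    n = len(d)
    best_inliers = np.zeros(n, dtype=bool)
    for _ in range(iters):
        # Minimal sample: 3 targets determine a candidate 3D velocity
        idx = rng.choice(n, size=3, replace=False)
        v, *_ = np.linalg.lstsq(A[idx], -d[idx], rcond=None)
        # Targets consistent with the candidate velocity are inliers
        residuals = np.abs(A @ v + d)
        inliers = residuals < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit on the full inlier set for the final estimate
    v, *_ = np.linalg.lstsq(A[best_inliers], -d[best_inliers], rcond=None)
    return v, best_inliers
```

In the paper's pipeline this radar-only velocity estimate is then fused with IMU data in the EKF, which propagates the state at the inertial rate and corrects it whenever a radar scan arrives.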