DeepEgo+: Unsynchronized Radar Sensor Fusion for Robust Vehicle Ego-Motion Estimation
Simin Zhu; Satish Ravindran; Lihui Chen; Alexander G. Yarovoy; Francesco Fioranelli
IEEE Transactions on Radar Systems, vol. 3, pp. 483-497. Published 2025-02-26. DOI: 10.1109/TRS.2025.3546001
Citations: 0
Abstract
This article studies the problem of estimating the 2-D motion state of a moving vehicle (ego motion) using millimeter-wave (mmWave) automotive radar sensors. Unlike prior single-radar or synchronized radar systems, the proposed approach (named DeepEgo+) can achieve sensor fusion and estimate ego motion using an unsynchronized radar sensor network. To achieve this goal, DeepEgo+ combines two neural network (NN)-based components (i.e., Module A for motion estimation and Module B for sensor fusion) with a decentralized processing architecture using the late fusion technique. Specifically, each radar sensor in the network has a Module A that processes its output and computes an initial motion estimate, while Module B fuses the initial estimates from all radar sensors and outputs the final estimate. This novel architecture and fusion scheme not only eliminates the synchronization requirement but also provides robustness and scalability to the system. To benchmark its performance, DeepEgo+ has been tested using a challenging real-world radar dataset, RadarScenes. The results show that DeepEgo+ provides significant performance advantages over recent state-of-the-art approaches in terms of estimation accuracy, long-term stability, and robustness against high outlier ratios and sensor failures. Furthermore, the influence of vehicle nonzero acceleration on ego-motion estimation is identified for the first time, and DeepEgo+ demonstrates the feasibility of compensating for its effect and further improving the estimation accuracy.
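The decentralized late-fusion idea described above can be sketched in miniature: each radar contributes an initial 2-D motion estimate (the role of Module A), and a fusion step combines them without requiring the sensors to share a common clock (the role of Module B). In the paper both modules are learned neural networks; the closed-form confidence- and recency-weighted average below, along with all names and parameters, is purely an illustrative assumption, not the authors' method.

```python
import math
from dataclasses import dataclass

@dataclass
class SensorEstimate:
    """Initial ego-motion estimate from one radar (stand-in for Module A output)."""
    vx: float         # longitudinal velocity (m/s)
    yaw_rate: float   # yaw rate (rad/s)
    timestamp: float  # sensor-local measurement time (s); sensors are unsynchronized
    confidence: float # per-sensor reliability score in [0, 1]

def fuse_estimates(estimates, query_time, tau=0.05):
    """Toy stand-in for Module B: fuse per-sensor estimates by a confidence-
    and recency-weighted average. Down-weighting stale measurements via the
    time offset is what removes the need for hard synchronization; tau is an
    assumed decay constant."""
    total_w = vx_acc = yaw_acc = 0.0
    for e in estimates:
        # Each sensor's weight decays with its age relative to the query time.
        w = e.confidence * math.exp(-abs(query_time - e.timestamp) / tau)
        total_w += w
        vx_acc += w * e.vx
        yaw_acc += w * e.yaw_rate
    return vx_acc / total_w, yaw_acc / total_w

# Two radars reporting at slightly different (unsynchronized) times.
reports = [SensorEstimate(vx=10.1, yaw_rate=0.02, timestamp=0.00, confidence=0.9),
           SensorEstimate(vx=9.9,  yaw_rate=0.03, timestamp=0.03, confidence=0.8)]
vx, yaw = fuse_estimates(reports, query_time=0.04)
```

Because the fusion step only weights whatever estimates are available, a failed sensor simply drops out of the sum, which loosely mirrors the robustness-to-sensor-failure property the paper reports for its learned fusion module.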