Title: Chained fusion of discrete and continuous epipolar geometry with odometry for long-term localization of mobile robots
Authors: David Tick, Jinglin Shen, Yinghua Zhang, N. Gans
Venue: 2011 IEEE International Conference on Control Applications (CCA)
Published: 2011-10-13
DOI: 10.1109/CCA.2011.6044426 (https://doi.org/10.1109/CCA.2011.6044426)
Citations: 5
Abstract
This paper presents a sensor fusion implementation that improves the accuracy of robot localization by combining multiple visual odometry approaches with wheel and IMU odometry. Discrete and continuous homography matrices are used to recover robot pose and velocity from image sequences of tracked feature points. The camera's limited field of view is addressed by chaining vision-based motion estimates: as feature points leave the field of view, new features are acquired and tracked, and whenever a new set of points is needed, the motion estimate is reinitialized and chained to the previous state estimate. An extended Kalman filter fuses measurements from the robot's wheel encoders with those from the visual and inertial measurement systems. Time-varying matrices in the extended Kalman filter compensate for known changes in sensor accuracy, including periods when visual features cannot be reliably tracked. Experiments are performed to validate the approach.
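The chaining idea in the abstract can be illustrated with a minimal sketch (not the authors' implementation): each time the tracked feature set is replaced, the vision-based estimate restarts from identity, and the new relative motion is composed onto the last global pose so the trajectory stays continuous. The planar SE(2) representation and the example motion segments below are assumptions for illustration.

```python
import numpy as np

def se2(x, y, theta):
    """Homogeneous 3x3 transform for a planar pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0.0, 0.0, 1.0]])

def chain(global_pose, relative_motion):
    """Compose a fresh relative-motion estimate onto the running global pose.

    When a new feature set is acquired, the visual estimate is
    reinitialized; chaining it onto the previous state estimate keeps
    the overall trajectory estimate continuous.
    """
    return global_pose @ relative_motion

# Hypothetical motion segments, each estimated from a fresh feature set.
pose = np.eye(3)
for dx, dy, dtheta in [(1.0, 0.0, 0.0), (0.5, 0.2, np.pi / 8)]:
    pose = chain(pose, se2(dx, dy, dtheta))
```

After the two segments, `pose` holds the composed global transform; in the paper's pipeline this chained estimate is one of the measurements fused by the extended Kalman filter.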