{"title":"基于单目相机和IMU的增强现实紧密耦合鲁棒视觉辅助惯性导航算法","authors":"T. Oskiper, S. Samarasekera, Rakesh Kumar","doi":"10.1109/ISMAR.2011.6143485","DOIUrl":null,"url":null,"abstract":"Odometry component of a camera tracking system for augmented reality applications is described. The system uses a MEMS-type inertial measurement unit (IMU) with 3-axis gyroscopes and accelerometers and a monocular camera to accurately and robustly track the camera motion in 6 degrees of freedom (with correct scale) in arbitrary indoor or outdoor scenes. Tight coupling of IMU and camera is achieved by an error-state extended Kalman filter (EKF) which performs sensor fusion for inertial navigation at a deep level such that each visually tracked feature contributes as an individual measurement as opposed to the more traditional approaches where camera pose estimates are first extracted by means of feature tracking and then used as measurement updates in a filter framework. Robustness, on the other hand, is achieved by using a geometric hypothesize-and-test architecture based on the five-point relative pose estimation method, rather than a Mahalanobis distance type gating mechanism derived from the Kalman filter state prediction, to select the inlier tracks and remove outliers from the raw feature point matches which would otherwise corrupt the filter since tracks are directly used as measurements.","PeriodicalId":298757,"journal":{"name":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Tightly-coupled robust vision aided inertial navigation algorithm for augmented reality using monocular camera and IMU\",\"authors\":\"T. Oskiper, S. Samarasekera, Rakesh Kumar\",\"doi\":\"10.1109/ISMAR.2011.6143485\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Odometry component of a camera tracking system for augmented reality applications is described. The system uses a MEMS-type inertial measurement unit (IMU) with 3-axis gyroscopes and accelerometers and a monocular camera to accurately and robustly track the camera motion in 6 degrees of freedom (with correct scale) in arbitrary indoor or outdoor scenes. Tight coupling of IMU and camera is achieved by an error-state extended Kalman filter (EKF) which performs sensor fusion for inertial navigation at a deep level such that each visually tracked feature contributes as an individual measurement as opposed to the more traditional approaches where camera pose estimates are first extracted by means of feature tracking and then used as measurement updates in a filter framework. 
Robustness, on the other hand, is achieved by using a geometric hypothesize-and-test architecture based on the five-point relative pose estimation method, rather than a Mahalanobis distance type gating mechanism derived from the Kalman filter state prediction, to select the inlier tracks and remove outliers from the raw feature point matches which would otherwise corrupt the filter since tracks are directly used as measurements.\",\"PeriodicalId\":298757,\"journal\":{\"name\":\"2011 10th IEEE International Symposium on Mixed and Augmented Reality\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 10th IEEE International Symposium on Mixed and Augmented Reality\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ISMAR.2011.6143485\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 10th IEEE International Symposium on Mixed and Augmented Reality","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMAR.2011.6143485","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The odometry component of a camera tracking system for augmented reality applications is described. The system combines a MEMS-type inertial measurement unit (IMU), with 3-axis gyroscopes and accelerometers, and a monocular camera to accurately and robustly track camera motion in 6 degrees of freedom, with correct metric scale, in arbitrary indoor or outdoor scenes. Tight coupling of the IMU and camera is achieved by an error-state extended Kalman filter (EKF) that fuses the sensors at a deep level: each visually tracked feature contributes an individual measurement, in contrast to more traditional approaches in which camera pose estimates are first extracted by feature tracking and only then used as measurement updates in a filter framework. Robustness, on the other hand, is achieved by a geometric hypothesize-and-test architecture based on the five-point relative pose estimation method, rather than by a Mahalanobis-distance gating mechanism derived from the Kalman filter state prediction. This architecture selects the inlier tracks and removes outliers from the raw feature point matches, which would otherwise corrupt the filter since the tracks are used directly as measurements.
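To make the tight coupling concrete, the sketch below shows one way an error-state EKF can consume each tracked feature as an individual 2-D measurement. It is a minimal illustration under stated assumptions, not the paper's implementation: the 9-dimensional error-state layout (position, velocity, orientation error), the left-multiplicative orientation-error convention, the unit-focal-length pinhole model, and the noise value are all choices made for the example.

```python
import numpy as np

# Minimal error-state EKF update sketch: each tracked feature is applied
# as its own 2-D image measurement. Hypothetical error state (9-dim):
# [position error (3), velocity error (3), orientation error (3)].

def project(p_cam):
    """Pinhole projection of a camera-frame point (unit focal length assumed)."""
    return p_cam[:2] / p_cam[2]

def projection_jacobian(p_cam):
    """Jacobian of the pinhole projection w.r.t. the camera-frame point."""
    x, y, z = p_cam
    return np.array([[1.0 / z, 0.0, -x / z**2],
                     [0.0, 1.0 / z, -y / z**2]])

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def per_feature_ekf_update(x_err, P, R_wc, p_cam_w, landmarks_w, observations,
                           meas_noise=1.0):
    """Apply each feature observation as an individual EKF measurement update.

    x_err: 9-dim error state, P: 9x9 covariance,
    R_wc: nominal world-to-camera rotation, p_cam_w: camera position in world,
    landmarks_w: 3-D landmark positions (world frame),
    observations: measured normalized image coordinates.
    """
    R_meas = (meas_noise ** 2) * np.eye(2)
    for p_w, z in zip(landmarks_w, observations):
        p_c = R_wc @ (p_w - p_cam_w)      # landmark in camera frame
        Jp = projection_jacobian(p_c)      # 2x3
        H = np.zeros((2, 9))               # velocity columns stay zero:
        H[:, 0:3] = -Jp @ R_wc             # d(meas)/d(position error)
        H[:, 6:9] = Jp @ skew(p_c)         # d(meas)/d(orientation error),
                                           # left-multiplicative convention
        r = z - project(p_c)               # reprojection residual
        S = H @ P @ H.T + R_meas
        K = P @ H.T @ np.linalg.inv(S)
        x_err = x_err + K @ (r - H @ x_err)
        P = (np.eye(9) - K @ H) @ P
    return x_err, P
```

Running every feature through its own 2-D update, rather than compressing a frame into a single 6-DoF pose measurement, is what lets the filter weight each track according to its own geometry and noise.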
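The robustness mechanism can likewise be sketched with off-the-shelf tools: OpenCV's cv2.findEssentialMat wraps Nistér's five-point relative pose solver in a RANSAC hypothesize-and-test loop, the same geometric gating idea the abstract describes. The intrinsic matrix, the threshold, and the synthetic matches below are placeholder assumptions for illustration.

```python
import numpy as np
import cv2

def select_inlier_tracks(pts_prev, pts_curr, K, ransac_thresh_px=1.0):
    """Gate raw feature matches geometrically and return a boolean inlier mask.

    pts_prev, pts_curr: Nx2 arrays of matched image points from two frames,
    K: 3x3 camera intrinsic matrix.
    """
    # Five-point relative pose hypotheses inside a RANSAC loop; matches
    # inconsistent with the winning epipolar geometry are flagged as outliers.
    E, mask = cv2.findEssentialMat(pts_prev, pts_curr, K,
                                   method=cv2.RANSAC,
                                   prob=0.999,
                                   threshold=ransac_thresh_px)
    return mask.ravel().astype(bool)

# Usage with synthetic data: random 3-D points seen from two camera poses.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
rng = np.random.default_rng(0)
pts_3d = rng.uniform([-2, -2, 4], [2, 2, 8], size=(60, 3))
t = np.array([0.3, 0.0, 0.0])                 # second camera shifted right
proj1 = pts_3d @ K.T
proj2 = (pts_3d - t) @ K.T
pts_prev = (proj1[:, :2] / proj1[:, 2:]).astype(np.float64)
pts_curr = (proj2[:, :2] / proj2[:, 2:]).astype(np.float64)
pts_curr[:5] += 25.0                          # corrupt a few matches
inliers = select_inlier_tracks(pts_prev, pts_curr, K)
print(f"{inliers.sum()} of {len(inliers)} matches kept as inliers")
```

Only the surviving tracks would then be fed to the filter as measurements, which matters precisely because, in the tightly-coupled design, a single bad track enters the EKF directly rather than being averaged away inside a separate pose estimator.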