{"title":"用于跟踪攻击运动的密集视觉惯性里程计","authors":"Yonggen Ling, S. Shen","doi":"10.1109/ROBIO.2015.7418830","DOIUrl":null,"url":null,"abstract":"We propose a sliding window-based dense visual-inertial fusion method for real-time tracking of challenging aggressive motions. Our method combines recent advances in direct dense visual odometry, inertial measurement unit (IMU) preintegration, and graph-based optimization. At the front-end, direct dense visual odometry provides camera pose tracking that is resistant to motion blur. At the back-end, a sliding window optimization-based fusion framework with efficient IMU preintegration generates smooth and high-accuracy state estimates, even with occasional visual tracking failures. A local loop closure that is integrated into the back-end further eliminates drift after extremely aggressive motions. Our system runs real-time at 25 Hz on an off-the-shelf laptop. Experimental results show that our method is able to accurately track motions with angular velocities up to 1000 degrees/s and velocities up to 4 m/s. We also compare our method with state-of-the-art systems, such as Google Tango, and show superior performance during challenging motions. We show that our method achieves reliable tracking results, even if we throw the sensor suite during experiments.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Dense visual-inertial odometry for tracking of aggressive motions\",\"authors\":\"Yonggen Ling, S. Shen\",\"doi\":\"10.1109/ROBIO.2015.7418830\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose a sliding window-based dense visual-inertial fusion method for real-time tracking of challenging aggressive motions. Our method combines recent advances in direct dense visual odometry, inertial measurement unit (IMU) preintegration, and graph-based optimization. At the front-end, direct dense visual odometry provides camera pose tracking that is resistant to motion blur. At the back-end, a sliding window optimization-based fusion framework with efficient IMU preintegration generates smooth and high-accuracy state estimates, even with occasional visual tracking failures. A local loop closure that is integrated into the back-end further eliminates drift after extremely aggressive motions. Our system runs real-time at 25 Hz on an off-the-shelf laptop. Experimental results show that our method is able to accurately track motions with angular velocities up to 1000 degrees/s and velocities up to 4 m/s. We also compare our method with state-of-the-art systems, such as Google Tango, and show superior performance during challenging motions. 
We show that our method achieves reliable tracking results, even if we throw the sensor suite during experiments.\",\"PeriodicalId\":325536,\"journal\":{\"name\":\"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)\",\"volume\":\"63 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ROBIO.2015.7418830\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ROBIO.2015.7418830","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Dense visual-inertial odometry for tracking of aggressive motions
We propose a sliding-window-based dense visual-inertial fusion method for real-time tracking of challenging aggressive motions. Our method combines recent advances in direct dense visual odometry, inertial measurement unit (IMU) preintegration, and graph-based optimization. At the front end, direct dense visual odometry provides camera pose tracking that is resistant to motion blur. At the back end, a sliding-window optimization-based fusion framework with efficient IMU preintegration generates smooth and high-accuracy state estimates, even with occasional visual tracking failures. A local loop closure integrated into the back end further eliminates drift after extremely aggressive motions. Our system runs in real time at 25 Hz on an off-the-shelf laptop. Experimental results show that our method accurately tracks motions with angular velocities up to 1000 degrees/s and linear velocities up to 4 m/s. We also compare our method with state-of-the-art systems, such as Google Tango, and show superior performance during challenging motions. We show that our method achieves reliable tracking results, even when the sensor suite is thrown during the experiments.
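To make the back-end IMU preintegration idea concrete, the sketch below accumulates relative rotation, velocity, and position increments between two camera frames from raw gyroscope and accelerometer samples, so that the constraint between keyframes can be formed without re-integrating when the linearization point changes. It follows the standard preintegration formulation in the literature (in the spirit of Lupton and Sukkarieh); the function names are illustrative, and bias estimation and noise-covariance propagation are omitted for brevity, so this is an assumption-laden sketch rather than the authors' implementation.

```python
# Minimal IMU preintegration sketch between two keyframes (numpy only).
# Illustrative only: no bias terms, no covariance propagation.
import numpy as np

def skew(w):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_so3(w):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-8:
        return np.eye(3) + skew(w)
    K = skew(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate(gyro, accel, dt):
    """Accumulate relative rotation dR, velocity dv, and position dp deltas
    from raw IMU samples taken between two keyframes. The result depends only
    on the measurements, not on the absolute pose or velocity."""
    dR = np.eye(3)
    dv = np.zeros(3)
    dp = np.zeros(3)
    for w, a in zip(gyro, accel):
        dp += dv * dt + 0.5 * (dR @ a) * dt**2   # position increment
        dv += (dR @ a) * dt                      # velocity increment
        dR = dR @ exp_so3(w * dt)                # rotation increment
    return dR, dv, dp

# Example: 100 synthetic IMU samples at 200 Hz between two frames.
gyro = np.random.randn(100, 3) * 0.01            # small angular rates (rad/s)
accel = np.tile(np.array([0.0, 0.0, 9.81]), (100, 1))  # gravity-dominated accel
dR, dv, dp = preintegrate(gyro, accel, dt=1.0 / 200.0)
print(dR, dv, dp)
```

In a sliding-window back end such as the one described above, these preintegrated quantities would serve as relative-motion factors between consecutive keyframe states, alongside the dense visual tracking and loop-closure constraints.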