{"title":"The Implementation of IMU/Stereo Vision Slam System for Mobile Robot","authors":"Hou Juan-Rou, Wang Zhan-qing","doi":"10.23919/icins43215.2020.9133980","DOIUrl":null,"url":null,"abstract":"In the research of unmanned vehicle technology, the integrated navigation system based on inertial measurement unit and stereo camera has gradually become a research hotspot. The inertial navigation system has the characteristics of higher short-time precision, and does not radiate information to the outside world. The stereo vision navigation system collects image information of the environment, and performs feature extraction and tracking on the feature points in the acquired images to recover the motion of the carrier. In this paper, the stereo vision navigation system is used to correct the long-term error accumulation of the inertial navigation system. On the other hand, the short-time precision of the inertial navigation system can also compensate the vision navigation system caused by the blurred image information caused by the carrier moving too fast. The integrated navigation system of IMU together with stereo vision camera can gain better comprehensive performance. In this paper, Multi-State Fusion Kalman Filter (MSF) and Multi-State Constraint Kalman Filter (MSCKF) are used to fuse the inertial navigation system and vision navigation system. When constructing the MSCKF framework, we use sparse optical flow method to achieve feature tracking, and using triangulation in computer vision to calculate the positions of feature points, and use the data set to verify the accuracy of the two algorithms. Finally, constructing an environmental point cloud map using the estimated state of feature points. 
Under the environment of i7-8750H cpu, the experimental results show that tight coupling is more accurate than loose coupling.","PeriodicalId":127936,"journal":{"name":"2020 27th Saint Petersburg International Conference on Integrated Navigation Systems (ICINS)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 27th Saint Petersburg International Conference on Integrated Navigation Systems (ICINS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/icins43215.2020.9133980","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7
Abstract
In research on unmanned vehicle technology, integrated navigation systems based on an inertial measurement unit (IMU) and a stereo camera have gradually become a research hotspot. The inertial navigation system offers high short-term precision and does not radiate information to the outside world. The stereo vision navigation system collects images of the environment and performs feature extraction and tracking on feature points in the acquired images to recover the motion of the carrier. In this paper, the stereo vision navigation system is used to correct the long-term error accumulation of the inertial navigation system; conversely, the short-term precision of the inertial navigation system compensates the vision navigation system when images are blurred because the carrier moves too fast. An integrated navigation system combining an IMU with a stereo camera can therefore achieve better overall performance. In this paper, a Multi-State Fusion Kalman Filter (MSF) and a Multi-State Constraint Kalman Filter (MSCKF) are used to fuse the inertial and vision navigation systems. When constructing the MSCKF framework, we use a sparse optical flow method for feature tracking, use triangulation from computer vision to calculate the positions of feature points, and use a data set to verify the accuracy of the two algorithms. Finally, an environmental point cloud map is constructed from the estimated states of the feature points. On an Intel i7-8750H CPU, the experimental results show that tight coupling is more accurate than loose coupling.
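The abstract states that feature tracking in the MSCKF front end uses a sparse optical flow method. As an illustration only (not the authors' code), the sketch below implements a minimal single-level Lucas-Kanade step in NumPy: for one feature window it solves the brightness-constancy equations in a least-squares sense. The function name `lk_flow` and the synthetic Gaussian frames in the test are assumptions for the demo; a real front end would add image pyramids and iterative refinement.

```python
import numpy as np

def lk_flow(I0, I1, pt, win=15):
    """Single-level Lucas-Kanade (illustrative sketch).

    Estimates the (dx, dy) displacement of the feature at integer
    pixel `pt` = (x, y) between grayscale frames I0 and I1 by solving
    the brightness-constancy constraint  Ix*dx + Iy*dy = I0 - I1
    over a win x win window in a least-squares sense.
    """
    half = win // 2
    x, y = pt
    ys = slice(y - half, y + half + 1)
    xs = slice(x - half, x + half + 1)
    # np.gradient returns derivatives along axis 0 (y) then axis 1 (x).
    gy, gx = np.gradient(I0)
    A = np.stack([gx[ys, xs].ravel(), gy[ys, xs].ravel()], axis=1)
    b = (I0 - I1)[ys, xs].ravel()  # equals -It for this pixel window
    d, *_ = np.linalg.lstsq(A, b, rcond=None)
    return d  # estimated (dx, dy) flow at the feature
```

Tracking a feature this way between consecutive frames yields the per-frame pixel correspondences that the filter later consumes.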