{"title":"Conditional Probabilistic Relative Visual Localization for Unmanned Aerial Vehicles","authors":"Andy Couturier, M. Akhloufi","doi":"10.1109/CCECE47787.2020.9255691","DOIUrl":null,"url":null,"abstract":"Unmanned aerial vehicles (UAV) are now used for a large number of applications in everyday life. These applications require autonomous navigation which is enabled by the self-localization solution integrated to the UAV. To perform self-localization, most UAVs are relying on a series of sensors combined with a global navigation satellite system (GNSS) in a sensor fusion framework. However, GNSS are using radio signals which are subjected to a large range of outages and interferences. This paper presents a relative visual localization (RVL) approach for GPS-denied environments using a down-facing 2D monocular camera and an inertial measurement unit (IMU). The solution is embedded in an adapted particle filter and use feature points to match images and estimate the localization of the UAV. A new conditional RVL measure is developed in order to leverage spare computation resources available during the data collection when the UAV is still receiving a GNSS signal. An evaluation of six feature point extraction methods is performed using real-world data while varying the number of feature points extracted. The results are promising and the approach has shown to be more efficient and to have fewer limitations than similar approaches in the literature.","PeriodicalId":296506,"journal":{"name":"2020 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE)","volume":"61 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CCECE47787.2020.9255691","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Unmanned aerial vehicles (UAVs) are now used for a large number of applications in everyday life. These applications require autonomous navigation, which is enabled by the self-localization solution integrated into the UAV. To perform self-localization, most UAVs rely on a series of sensors combined with a global navigation satellite system (GNSS) in a sensor fusion framework. However, GNSS relies on radio signals that are subject to a wide range of outages and interference. This paper presents a relative visual localization (RVL) approach for GPS-denied environments using a down-facing 2D monocular camera and an inertial measurement unit (IMU). The solution is embedded in an adapted particle filter and uses feature points to match images and estimate the localization of the UAV. A new conditional RVL measure is developed to leverage spare computational resources available during data collection, while the UAV is still receiving a GNSS signal. An evaluation of six feature point extraction methods is performed using real-world data while varying the number of feature points extracted. The results are promising, and the approach has been shown to be more efficient and to have fewer limitations than similar approaches in the literature.
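To make the two main ingredients of the abstract concrete (feature-point matching between down-facing frames and a particle-filter update driven by that match), the sketch below shows a minimal, hypothetical version of such a pipeline. It uses ORB features via OpenCV purely as an illustration; the function names, the median-shift measurement, the Gaussian likelihood, and the sigma value are assumptions and not the paper's actual conditional RVL measure or adapted particle filter.

```python
# Hypothetical sketch: ORB feature matching between two down-facing frames,
# feeding a simple particle-filter reweighting step. Illustrative only; it is
# not the authors' method or parameter choices.
import cv2
import numpy as np

def match_displacement(prev_gray, curr_gray, n_features=500):
    """Estimate the median pixel displacement between two grayscale frames."""
    orb = cv2.ORB_create(nfeatures=n_features)
    kp1, des1 = orb.detectAndCompute(prev_gray, None)
    kp2, des2 = orb.detectAndCompute(curr_gray, None)
    if des1 is None or des2 is None:
        return None  # not enough texture to extract features
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    if not matches:
        return None
    # Per-match pixel shift, summarized by the median for robustness to outliers.
    shifts = np.array([np.array(kp2[m.trainIdx].pt) - np.array(kp1[m.queryIdx].pt)
                       for m in matches])
    return np.median(shifts, axis=0)  # (dx, dy) in pixels

def update_particles(weights, predicted_shifts, measured_shift, sigma=5.0):
    """Reweight particles by how well their predicted image shift matches the measurement."""
    err = np.linalg.norm(predicted_shifts - measured_shift, axis=1)
    weights = weights * np.exp(-0.5 * (err / sigma) ** 2)  # assumed Gaussian likelihood
    weights += 1e-12                                       # avoid total degeneracy
    return weights / weights.sum()
```

In a full system, each particle would carry a pose hypothesis, `predicted_shifts` would be derived from that pose and the IMU prediction step, and resampling would follow the reweighting; those pieces are omitted here for brevity.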