3D Visual Homing for Commodity UAVs
Hao Cai, Sipan Ye, A. Vardy, Minglun Gong
2018 15th Conference on Computer and Robot Vision (CRV), May 2018
DOI: 10.1109/CRV.2018.00045 (https://doi.org/10.1109/CRV.2018.00045)
Citations: 0
Abstract
Visual homing enables an autonomous robot to move to a target (home) position using only visual information. While 2D visual homing has been widely studied, homing in 3D space has received comparatively little attention. This paper presents a novel 3D visual homing method that can be applied to commodity Unmanned Aerial Vehicles (UAVs). First, relative camera poses are estimated from feature correspondences between current views and the reference home image. Homing vectors are then computed and used to guide the UAV toward the 3D home location. All computations run in real time on mobile devices through a mobile app. To validate our approach, we conducted quantitative evaluations on the most popular image sequence datasets and performed real-world experiments on a quadcopter (a DJI Mavic Pro). Experimental results demonstrate the effectiveness of the proposed method.
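The abstract's core step, turning an estimated relative camera pose into a homing vector, can be sketched as follows. This is not the paper's implementation; it is a minimal illustration assuming the common convention that the pose (R, t) maps points from the current camera frame into the home frame, i.e. X_home = R · X_current + t, and that monocular pose estimation fixes only the direction of translation, so the homing vector is a unit vector.

```python
import math

def homing_vector(R, t):
    """Direction from the current camera toward the home position,
    expressed in the current camera frame.

    Assumes pose convention X_home = R @ X_current + t, so the home
    camera centre in the current frame is -R^T t. Monocular pose
    recovery gives translation only up to scale, hence the result
    is normalized to a unit vector.
    """
    # Compute -R^T t: column i of R dotted with t, negated.
    v = [-(R[0][i] * t[0] + R[1][i] * t[1] + R[2][i] * t[2])
         for i in range(3)]
    # Normalize to a unit direction (scale is unobservable).
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# Example: identity rotation, home frame displaced by +1 along x.
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
v = homing_vector(identity, [1.0, 0.0, 0.0])
# The UAV should move along -x in its own frame: [-1.0, 0.0, 0.0]
```

In a full pipeline, (R, t) would typically come from matching features between the current view and the home image and decomposing the essential matrix (e.g., via OpenCV's `findEssentialMat`/`recoverPose`); the resulting unit vector then drives the UAV's velocity command each frame.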