{"title":"Field Test Validations of Vision-based Multi-camera Multi-drone Tracking and 3D Localizing with Concurrent Camera Pose Estimation","authors":"Niven Jun Liang Sie, S. Srigrarom, Sunan Huang","doi":"10.1109/ICCRE51898.2021.9435654","DOIUrl":null,"url":null,"abstract":"This paper reports the field test validations of the recently proposed vision-based real-time multi-camera setups for detecting, tracking and 3D localizing multiple aerial targets (mainly drones). We also propose the additional concurrent camera pose estimation when the camera poses are not known beforehand. This extra step can be used alongside (in parallel) with the drone tracking and localizing process. We conducted flight tests using 2 drones flying in 2 specific scenarios, and used 3 cameras to observe, detect, track and locate the positions of both drones in global frame. The efficacy of our technique is measured by the accuracy of the temporal and spatial positions of the observed drones, against the drones’ own GPS recordings. Our initial results show reasonable accuracy, i.e. ±1m at 50m, as such, the proposed vision-based methods can be used for drone detection and tracking.","PeriodicalId":382619,"journal":{"name":"2021 6th International Conference on Control and Robotics Engineering (ICCRE)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 6th International Conference on Control and Robotics Engineering (ICCRE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCRE51898.2021.9435654","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
This paper reports field test validations of the recently proposed vision-based real-time multi-camera setups for detecting, tracking, and 3D-localizing multiple aerial targets (mainly drones). We also propose an additional concurrent camera pose estimation step for cases where the camera poses are not known beforehand; this step can run in parallel with the drone tracking and localization process. We conducted flight tests with two drones flying in two specific scenarios and used three cameras to observe, detect, track, and locate the positions of both drones in the global frame. The efficacy of our technique is measured by the temporal and spatial accuracy of the estimated drone positions against the drones' own GPS recordings. Our initial results show reasonable accuracy (about ±1 m at a range of 50 m), so the proposed vision-based methods can be used for drone detection and tracking.
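The core geometric step the abstract describes is multi-view 3D localization: once each camera's pose (and hence projection matrix) is known, a drone detection in two or more images can be triangulated into a single point in the global frame. The sketch below is not the authors' implementation; it is a minimal linear (DLT) triangulation example, and the function names and the synthetic numbers are illustrative assumptions only. NumPy is the only dependency.

```python
import numpy as np


def triangulate(proj_mats, pixels):
    """Linear (DLT) triangulation of one 3D point from N >= 2 views.

    proj_mats : list of 3x4 camera projection matrices P = K [R | t]
    pixels    : list of (u, v) detections of the same target, one per camera
    Returns the estimated 3D point in the global frame.
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view contributes two linear constraints on the homogeneous
        # point X: u*(P[2] @ X) - (P[0] @ X) = 0 and v*(P[2] @ X) - (P[1] @ X) = 0
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The solution is the right singular vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]


if __name__ == "__main__":
    # Synthetic check with three laterally offset cameras observing a target
    # roughly 50 m away (all values made up for illustration).
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    target = np.array([10.0, 5.0, 50.0, 1.0])

    proj_mats, pixels = [], []
    for tx in (-5.0, 0.0, 5.0):
        Rt = np.hstack([np.eye(3), np.array([[tx], [0.0], [0.0]])])
        P = K @ Rt
        uvw = P @ target
        proj_mats.append(P)
        pixels.append((uvw[0] / uvw[2], uvw[1] / uvw[2]))

    print(triangulate(proj_mats, pixels))  # ~ [10.  5. 50.]
```

When the camera poses are not known beforehand, as in the paper's concurrent pose estimation step, the projection matrices would first have to be recovered (e.g. from known reference points via a PnP-style solver) before feeding them to a triangulation routine like the one above.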