{"title":"结合视觉和红外传感器的自主无人机鲁棒精确着陆","authors":"Giannis Badakis, Manos Koutsoubelias, S. Lalis","doi":"10.1109/SAS51076.2021.9530091","DOIUrl":null,"url":null,"abstract":"One of the challenges in drone-based systems is to support automated landing with a high precision that goes beyond the accuracy of standard off-the-shelf GPS. Various efforts have been made to support this, mainly using vision-based and infrared sensors. However, using a single sensor inevitably introduces a single point of failure. To address this problem, we combine a vision-based sensor that detects special visual markers with a sensor that tracks an infrared beacon. We also support a more cautious landing approach for the case where these sensors temporarily fail to detect their targets. We implement our solution in the context of a mature autopilot framework, through modular extensions that are transparent to the rest of the software stack. We evaluate these mechanisms by conducting field experiments using a custom drone, activating faults in the individual precision landing sensor subsystems in a controlled way through interactive commands that are sent to the drone at runtime. The results show that our solution achieves robust precision landing under different failure scenarios while maintaining the accuracy of fault-free sensor operation.","PeriodicalId":224327,"journal":{"name":"2021 IEEE Sensors Applications Symposium (SAS)","volume":"1146 ","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Robust Precision Landing for Autonomous Drones Combining Vision-based and Infrared Sensors\",\"authors\":\"Giannis Badakis, Manos Koutsoubelias, S. Lalis\",\"doi\":\"10.1109/SAS51076.2021.9530091\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"One of the challenges in drone-based systems is to support automated landing with a high precision that goes beyond the accuracy of standard off-the-shelf GPS. Various efforts have been made to support this, mainly using vision-based and infrared sensors. However, using a single sensor inevitably introduces a single point of failure. To address this problem, we combine a vision-based sensor that detects special visual markers with a sensor that tracks an infrared beacon. We also support a more cautious landing approach for the case where these sensors temporarily fail to detect their targets. We implement our solution in the context of a mature autopilot framework, through modular extensions that are transparent to the rest of the software stack. We evaluate these mechanisms by conducting field experiments using a custom drone, activating faults in the individual precision landing sensor subsystems in a controlled way through interactive commands that are sent to the drone at runtime. 
The results show that our solution achieves robust precision landing under different failure scenarios while maintaining the accuracy of fault-free sensor operation.\",\"PeriodicalId\":224327,\"journal\":{\"name\":\"2021 IEEE Sensors Applications Symposium (SAS)\",\"volume\":\"1146 \",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-08-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE Sensors Applications Symposium (SAS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SAS51076.2021.9530091\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE Sensors Applications Symposium (SAS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SAS51076.2021.9530091","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Robust Precision Landing for Autonomous Drones Combining Vision-based and Infrared Sensors
One of the challenges in drone-based systems is to support automated landing with a precision that goes beyond the accuracy of standard off-the-shelf GPS. Various efforts have been made to support this, mainly using vision-based and infrared sensors. However, relying on a single sensor inevitably introduces a single point of failure. To address this problem, we combine a vision-based sensor that detects special visual markers with a sensor that tracks an infrared beacon. We also support a more cautious landing approach for the case where these sensors temporarily fail to detect their targets. We implement our solution in the context of a mature autopilot framework, through modular extensions that are transparent to the rest of the software stack. We evaluate these mechanisms through field experiments with a custom drone, activating faults in the individual precision landing sensor subsystems in a controlled way via interactive commands sent to the drone at runtime. The results show that our solution achieves robust precision landing under different failure scenarios while maintaining the accuracy of fault-free sensor operation.
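To make the sensor-combination idea concrete, the sketch below shows one possible target-selection policy: prefer a fresh vision-based fix, fall back to the infrared beacon fix, and switch to a cautious descent when neither sensor currently reports its target. This is only an illustration of the general approach described in the abstract, not the paper's implementation; the class and method names and the 0.5 s freshness threshold are assumptions and do not come from the paper or from any particular autopilot framework.

```python
# Hypothetical sketch of a precision-landing target-selection policy that
# combines a vision-based marker detector with an infrared beacon tracker.
# Names and thresholds are illustrative assumptions, not the paper's code.

from dataclasses import dataclass
from typing import Optional
import time


@dataclass
class TargetFix:
    """Horizontal offset of the landing target relative to the drone (meters)."""
    x: float
    y: float
    timestamp: float  # seconds on a monotonic clock


class PrecisionLandingSelector:
    def __init__(self, max_fix_age: float = 0.5):
        # A fix older than max_fix_age seconds is treated as "target lost".
        self.max_fix_age = max_fix_age
        self.vision_fix: Optional[TargetFix] = None
        self.infrared_fix: Optional[TargetFix] = None

    def update_vision(self, x: float, y: float) -> None:
        self.vision_fix = TargetFix(x, y, time.monotonic())

    def update_infrared(self, x: float, y: float) -> None:
        self.infrared_fix = TargetFix(x, y, time.monotonic())

    def _fresh(self, fix: Optional[TargetFix]) -> Optional[TargetFix]:
        # Return the fix only if it is recent enough to be trusted.
        if fix is not None and time.monotonic() - fix.timestamp <= self.max_fix_age:
            return fix
        return None

    def select(self) -> tuple[str, Optional[TargetFix]]:
        """Prefer vision, fall back to infrared; otherwise descend cautiously
        with no lateral correction until a sensor reacquires its target."""
        vision = self._fresh(self.vision_fix)
        if vision is not None:
            return "vision", vision
        infrared = self._fresh(self.infrared_fix)
        if infrared is not None:
            return "infrared", infrared
        return "cautious_descent", None


if __name__ == "__main__":
    selector = PrecisionLandingSelector()
    selector.update_infrared(0.12, -0.05)  # only the IR beacon is currently detected
    mode, fix = selector.select()
    print(mode, fix)  # -> "infrared" with the beacon offset
```

In this sketch, a fault in one sensor subsystem simply causes its fixes to go stale, so the policy degrades from vision, to infrared, to a cautious descent, mirroring the fallback behavior the abstract describes at a high level.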