Correction: Comparison of RADAR, Passive Optical with Acoustic, and Fused Multi-Modal Active and Passive Sensing for UAS Traffic Management Compliance and Urban Air Mobility Safety
Sam Siewert, M. Andalibi, S. Bruder, Jonathan M. Buchholz, Doug Chamberlain, Alexandra L. Lindsey, Trevis Shiroma, David Stockhouse
{"title":"Correction: Comparison of RADAR, Passive Optical with Acoustic, and Fused Multi-Modal Active and Passive Sensing for UAS Traffic Management Compliance and Urban Air Mobility Safety","authors":"Sam Siewert, M. Andalibi, S. Bruder, Jonathan M. Buchholz, Doug Chamberlain, Alexandra L. Lindsey, Trevis Shiroma, David Stockhouse","doi":"10.2514/6.2020-1456.c1","DOIUrl":null,"url":null,"abstract":"Embry Riddle Aeronautical University Prescott has designed and prototyped a small UAS (Unmanned Aerial System) detection, tracking, classification and identification system for UTM (UAS Traffic Management) compliance verification and UAM (Urban Air Mobility). This system primarily uses passive optical and acoustic sensors along with GNSS (Global Navigation Satellite Systems), or GPS (Global Positioning System) and ADS-B (Automatic Dependent Surveillance Broadcast). The system design, known as Drone Net, is a network of passive sensors designed to cover a kilometer square area. In this paper we present the results of experiments to add an active RADAR (Radio Detection and Ranging) on the ground and active LIDAR (Light Detection and Ranging) on flight nodes (small UAS). System tests completed previously have shown feasibility for all passive optical and acoustic UTM and UAM. The point of the RADAR testing is to directly compare passive sensor networks to active to determine the value of each alone and to test the hypothesis that multimodal active and passive sensing will be superior, but likely at higher cost than passive alone. Previously, we completed coordinated experiments with an EO/IR camera system and acoustic sensor network with an Allsky hemispherical six-camera system with resolution up to 12 million pixels to optimize detection and tracking. For some applications such as corporate campuses and Class-D airports, passive sensing alone might be sufficient and most cost effective, but for urban, military, and higher traffic Class-C and Class-B larger airports, we believe the combined multi-modal purpose-built sUAS RADAR integrated with optical and acoustic sensor networks will be most effective. Based upon our preliminary results presented herein, purpose built small RADAR with wide fields of view configured into an All-sky active system along with optical All-sky cameras and acoustic sensors are ideal for urban locations requiring the highest confidence in monitoring of combined UTM, UAM and GA (General Aviation) traffic. UTM will mostly be small UAS delivering parcels and must share airspace with UAM carrying passengers for short-hop transportation along with general aviation covering longer distances, but also entering urban airspace at airports. Clearly the airspace in urban locations is going to become much more congested, with new risk, but also new opportunity to improve transportation overall. The results we present here are a start at answering basic questions about multi-modal sensor effectiveness for urban navigation, both passive and active. In scenarios where sUAS and UAM may not have reliable GNSS or might not be UTM compliant, the ground detection, tracking and localization is most critical for assured urban airspace safety and security.","PeriodicalId":93413,"journal":{"name":"Applied aerodynamics : papers presented at the AIAA SciTech Forum and Exposition 2020 : Orlando, Florida, USA, 6-10 January 2020. 
AIAA SciTech Forum and Exposition (2020 : Orlando, Fla.)","volume":"26 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2020-01-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied aerodynamics : papers presented at the AIAA SciTech Forum and Exposition 2020 : Orlando, Florida, USA, 6-10 January 2020. AIAA SciTech Forum and Exposition (2020 : Orlando, Fla.)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2514/6.2020-1456.c1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1
Abstract
Embry-Riddle Aeronautical University, Prescott, has designed and prototyped a small UAS (Unmanned Aerial System) detection, tracking, classification, and identification system for UTM (UAS Traffic Management) compliance verification and UAM (Urban Air Mobility). The system primarily uses passive optical and acoustic sensors along with GNSS (Global Navigation Satellite Systems) or GPS (Global Positioning System) and ADS-B (Automatic Dependent Surveillance-Broadcast). The system design, known as Drone Net, is a network of passive sensors designed to cover a one-square-kilometer area. In this paper we present the results of experiments that add active RADAR (Radio Detection and Ranging) on the ground and active LIDAR (Light Detection and Ranging) on flight nodes (small UAS). Previously completed system tests have shown the feasibility of an all-passive optical and acoustic approach for UTM and UAM. The purpose of the RADAR testing is to compare passive sensor networks directly with active ones, to determine the value of each alone, and to test the hypothesis that multi-modal active and passive sensing will be superior, though likely at higher cost than passive sensing alone. Previously, we completed coordinated experiments with an EO/IR camera system and an acoustic sensor network, together with an all-sky hemispherical six-camera system with resolution up to 12 million pixels, to optimize detection and tracking. For some applications, such as corporate campuses and Class-D airports, passive sensing alone might be sufficient and most cost-effective; but for urban, military, and higher-traffic Class-C and Class-B airports, we believe a purpose-built multi-modal sUAS RADAR integrated with optical and acoustic sensor networks will be most effective. Based on the preliminary results presented herein, purpose-built small RADAR units with wide fields of view, configured into an all-sky active system alongside optical all-sky cameras and acoustic sensors, are ideal for urban locations requiring the highest confidence in monitoring combined UTM, UAM, and GA (General Aviation) traffic. UTM traffic will mostly be small UAS delivering parcels and must share airspace with UAM vehicles carrying passengers on short-hop routes, as well as with general aviation covering longer distances but also entering urban airspace at airports. Urban airspace is clearly going to become much more congested, bringing new risk but also new opportunity to improve transportation overall. The results presented here are a start at answering basic questions about multi-modal sensor effectiveness, both passive and active, for urban navigation. In scenarios where sUAS and UAM may not have reliable GNSS or might not be UTM compliant, ground-based detection, tracking, and localization are most critical for assured urban airspace safety and security.
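
The abstract does not describe a specific fusion algorithm, but the multi-modal hypothesis can be illustrated with a minimal sketch: assuming each sensor modality (EO/IR, acoustic, RADAR) reports an independent detection confidence for a candidate track, a simple independent-sensor fusion rule shows how adding an active modality to a passive-only network can raise the combined detection probability past a decision threshold. The modality labels, confidence values, and threshold below are hypothetical and chosen only for illustration; they are not taken from the paper.

```python
# Minimal sketch (not from the paper): naive multi-modal detection fusion,
# assuming each sensor modality reports an independent detection confidence
# for the same candidate track.
from dataclasses import dataclass
from typing import Dict


@dataclass
class SensorReport:
    modality: str      # e.g. "eo_ir", "acoustic", "radar" (hypothetical labels)
    confidence: float  # per-modality probability of detection, in [0, 1]


def fuse_detections(reports: Dict[str, SensorReport], threshold: float = 0.9) -> bool:
    """Combine independent confidences: P(detect) = 1 - prod(1 - p_i).
    The 0.9 threshold is illustrative, not a value from the paper."""
    miss_prob = 1.0
    for report in reports.values():
        miss_prob *= (1.0 - report.confidence)
    return (1.0 - miss_prob) >= threshold


# Example: passive-only sensing vs. passive plus active RADAR for one track.
passive_only = {
    "eo_ir": SensorReport("eo_ir", 0.70),
    "acoustic": SensorReport("acoustic", 0.60),
}
with_radar = dict(passive_only, radar=SensorReport("radar", 0.80))

print(fuse_detections(passive_only))  # False: 1 - 0.3*0.4     = 0.88  < 0.9
print(fuse_detections(with_radar))    # True:  1 - 0.3*0.4*0.2 = 0.976 >= 0.9
```

Under these assumed numbers, the passive-only network falls just short of the detection threshold while the addition of RADAR pushes the fused confidence above it, which is the qualitative trade-off (higher confidence at higher cost) the paper sets out to quantify.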