A General Framework for Human-Drone Interaction under Limited On-board Sensing

S. Nayhouse, S. Chadha, P. Hourican, C. Moore, N. Bezzo
DOI: 10.1109/SIEDS58326.2023.10137774
Published in: 2023 Systems and Information Engineering Design Symposium (SIEDS)
Publication date: 2023-04-27

Abstract

Recent advancements in unmanned aerial vehicles (UAVs) have enabled their deployment in numerous applications such as aerial photography, infrastructure inspection, search and rescue, and surveillance. Despite the potential for full autonomy, many applications still require human operators for navigating complex environments and for decision-making. Existing solutions often employ high-precision sensors such as 2-D or 3-D LiDAR, which may provide more data than necessary and contribute to increased system complexity and cost. To address these challenges and bridge the gap between fully autonomous and human-controlled UAVs, this work develops a shared-autonomy framework that leverages lightweight, low-cost 1-D LiDAR sensors combined with mobility behaviors to achieve performance comparable to more advanced 2-D/3-D LiDAR sensors while minimizing energy, computational overhead, and weight. Our framework includes a novel state-machine method that exploits UAV mobility to compensate for the limitations of 1-D LiDAR sensors, ensuring safety and obstacle avoidance through a physics-based algorithm that transitions between teleoperation and autonomous mode as needed, based on environmental conditions and safety-critical events. Experimental validation on real UAVs demonstrates the effectiveness of this shared-autonomy scheme in complex environments, and the system is further generalized to larger UAVs and prototyped with a custom sensor configuration and onboard obstacle avoidance.
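The mode-switching idea in the abstract (a physics-based rule that hands control from the human to an autonomous avoidance behavior when a 1-D LiDAR return falls inside the vehicle's stopping distance) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the class and function names, the deceleration limit, and the safety margin are all hypothetical values chosen for the example.

```python
from enum import Enum, auto


class Mode(Enum):
    TELEOP = auto()       # human operator commands the UAV
    AUTONOMOUS = auto()   # onboard avoidance takes over


def braking_distance(speed_mps: float, max_decel: float) -> float:
    """Kinematic stopping distance d = v^2 / (2a)."""
    return speed_mps * speed_mps / (2.0 * max_decel)


class SharedAutonomyFSM:
    """Toy state machine: switch to autonomous mode when the forward
    1-D LiDAR range is inside the stopping distance plus a margin."""

    def __init__(self, max_decel: float = 4.0, margin_m: float = 0.5):
        self.mode = Mode.TELEOP
        self.max_decel = max_decel  # assumed braking capability, m/s^2
        self.margin_m = margin_m    # assumed extra safety buffer, m

    def step(self, lidar_range_m: float, speed_mps: float) -> Mode:
        safe_dist = braking_distance(speed_mps, self.max_decel) + self.margin_m
        self.mode = Mode.AUTONOMOUS if lidar_range_m < safe_dist else Mode.TELEOP
        return self.mode
```

For example, at 2 m/s with a 4 m/s² deceleration limit, the stopping distance is 0.5 m, so an obstacle at 0.8 m (inside the 1.0 m threshold with the margin) would trigger the autonomous mode, while one at 5 m would leave the operator in control.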