A versatile real-time vision-led runway localisation system for enhanced autonomy.

Frontiers in Robotics and AI (IF 2.9, Q2 Robotics) · Published: 2024-12-06 · eCollection: 2024-01-01 · DOI: 10.3389/frobt.2024.1490812
Kyriacos Tsapparellas, Nickolay Jelev, Jonathon Waters, Aditya M Shrikhande, Sabine Brunswicker, Lyudmila S Mihaylova

Abstract

This paper proposes a solution to the challenging task of autonomously landing Unmanned Aerial Vehicles (UAVs). An onboard computer vision module integrates the vision system with the ground control communication and video server connection. The vision platform performs feature extraction using Speeded Up Robust Features (SURF), followed by fast Structured Forests edge detection and smoothing with a Kalman filter for accurate runway sideline prediction. A thorough evaluation is performed over real-world and simulation environments with respect to accuracy and processing time, in comparison with state-of-the-art edge detection approaches. The vision system is validated on videos recorded in clear and difficult weather conditions, including fog, varying lighting conditions and crosswind landings. The experiments use data from the X-Plane 11 flight simulator and real flight data from the Uncrewed Low-cost TRAnsport (ULTRA) self-flying cargo UAV. The vision-led system localises the runway sidelines with the Structured Forests approach with an accuracy of approximately 84.4%, outperforming the state-of-the-art approaches while delivering real-time performance. The main contribution of this work is the developed vision-led runway detection system, which aids autonomous landing of UAVs using electro-optical cameras. Although implemented on the ULTRA UAV, the vision-led system is applicable to any other UAV.
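The abstract describes a pipeline in which per-frame sideline detections are smoothed by a Kalman filter. The paper's exact filter design is not given here, so the following is a hypothetical minimal sketch: a constant-velocity Kalman filter over a sideline parameterised by slope m and intercept b, where each frame's edge-detection output supplies a noisy (m, b) measurement.

```python
import numpy as np

class SidelineKalman:
    """Hypothetical constant-velocity Kalman filter that smooths a runway
    sideline's (slope, intercept) estimates across video frames.
    State x = [m, b, dm, db]; measurement z = [m, b]."""

    def __init__(self, q=1e-3, r=1e-1):
        self.x = np.zeros(4)                 # initial state
        self.P = np.eye(4)                   # state covariance
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0    # position += velocity each frame
        self.H = np.zeros((2, 4))
        self.H[0, 0] = self.H[1, 1] = 1.0    # we observe (m, b) directly
        self.Q = q * np.eye(4)               # process noise
        self.R = r * np.eye(2)               # measurement noise

    def step(self, z):
        # Predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Update with the frame's noisy measurement z = [m, b]
        y = np.asarray(z, dtype=float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                    # smoothed (m, b)
```

In use, the edge detector's per-frame line fit would be passed to `step`, and the returned smoothed parameters drawn as the predicted sideline; the noise scales `q` and `r` are illustrative placeholders, not values from the paper.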

Source journal: Frontiers in Robotics and AI
CiteScore: 6.50
Self-citation rate: 5.90%
Articles published: 355
Review time: 14 weeks
Journal description: Frontiers in Robotics and AI publishes rigorously peer-reviewed research covering all theory and applications of robotics, technology, and artificial intelligence, from biomedical to space robotics.
Latest articles from this journal:
- Advanced robotics for automated EV battery testing using electrochemical impedance spectroscopy.
- Pig tongue soft robot mimicking intrinsic tongue muscle structure.
- A fast monocular 6D pose estimation method for textureless objects based on perceptual hashing and template matching.
- Semantic segmentation using synthetic images of underwater marine-growth.
- A comparative psychological evaluation of a robotic avatar in Dubai and Japan.