FIReStereo: Forest InfraRed Stereo Dataset for UAS Depth Perception in Visually Degraded Environments

Devansh Dhrafani, Yifei Liu, Andrew Jong, Ukcheol Shin, Yao He, Tyler Harp, Yaoyu Hu, Jean Oh, Sebastian Scherer
{"title":"FIReStereo: Forest InfraRed Stereo Dataset for UAS Depth Perception in Visually Degraded Environments","authors":"Devansh Dhrafani, Yifei Liu, Andrew Jong, Ukcheol Shin, Yao He, Tyler Harp, Yaoyu Hu, Jean Oh, Sebastian Scherer","doi":"arxiv-2409.07715","DOIUrl":null,"url":null,"abstract":"Robust depth perception in visually-degraded environments is crucial for\nautonomous aerial systems. Thermal imaging cameras, which capture infrared\nradiation, are robust to visual degradation. However, due to lack of a\nlarge-scale dataset, the use of thermal cameras for unmanned aerial system\n(UAS) depth perception has remained largely unexplored. This paper presents a\nstereo thermal depth perception dataset for autonomous aerial perception\napplications. The dataset consists of stereo thermal images, LiDAR, IMU and\nground truth depth maps captured in urban and forest settings under diverse\nconditions like day, night, rain, and smoke. We benchmark representative stereo\ndepth estimation algorithms, offering insights into their performance in\ndegraded conditions. Models trained on our dataset generalize well to unseen\nsmoky conditions, highlighting the robustness of stereo thermal imaging for\ndepth perception. We aim for this work to enhance robotic perception in\ndisaster scenarios, allowing for exploration and operations in previously\nunreachable areas. 
The dataset and source code are available at\nhttps://firestereo.github.io.","PeriodicalId":501031,"journal":{"name":"arXiv - CS - Robotics","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Robotics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.07715","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Robust depth perception in visually degraded environments is crucial for autonomous aerial systems. Thermal imaging cameras, which capture infrared radiation, are robust to visual degradation. However, due to the lack of a large-scale dataset, the use of thermal cameras for unmanned aerial system (UAS) depth perception has remained largely unexplored. This paper presents a stereo thermal depth perception dataset for autonomous aerial perception applications. The dataset consists of stereo thermal images, LiDAR, IMU, and ground-truth depth maps captured in urban and forest settings under diverse conditions such as day, night, rain, and smoke. We benchmark representative stereo depth estimation algorithms, offering insights into their performance in degraded conditions. Models trained on our dataset generalize well to unseen smoky conditions, highlighting the robustness of stereo thermal imaging for depth perception. We aim for this work to enhance robotic perception in disaster scenarios, allowing exploration and operations in previously unreachable areas. The dataset and source code are available at https://firestereo.github.io.
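The abstract describes benchmarking stereo depth estimation on thermal image pairs. As a rough illustration of the underlying principle only (not the learning-based methods benchmarked in the paper), classical stereo matching searches along the epipolar line for the horizontal shift (disparity) that best matches a local patch; depth is then `focal_length_px * baseline_m / disparity`. The sketch below is a naive sum-of-absolute-differences block matcher; the function name and parameters are illustrative, not from the paper:

```python
import numpy as np

def sad_disparity(left, right, max_disp=16, block=5):
    """Naive SAD block-matching disparity for a rectified stereo pair.

    For each pixel in the left image, slide a patch leftward over the
    right image and keep the shift (disparity) with the lowest
    sum-of-absolute-differences cost.
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(np.int32)
            best_cost, best_d = None, 0
            # Only search shifts that keep the candidate window in-bounds.
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1].astype(np.int32)
                cost = np.abs(patch - cand).sum()
                if best_cost is None or cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp

# Depth recovery (valid only where disparity > 0):
# depth_m = focal_length_px * baseline_m / disparity
```

Real pipelines (and the methods benchmarked in the paper) add cost aggregation, subpixel refinement, or learned matching, but the disparity-to-depth relationship is the same.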