FIReStereo: Forest InfraRed Stereo Dataset for UAS Depth Perception in Visually Degraded Environments
Devansh Dhrafani, Yifei Liu, Andrew Jong, Ukcheol Shin, Yao He, Tyler Harp, Yaoyu Hu, Jean Oh, Sebastian Scherer
arXiv:2409.07715 [cs.RO], September 12, 2024
Abstract
Robust depth perception in visually degraded environments is crucial for autonomous aerial systems. Thermal imaging cameras, which capture infrared radiation, are robust to visual degradation. However, due to the lack of a large-scale dataset, the use of thermal cameras for unmanned aerial system (UAS) depth perception has remained largely unexplored. This paper presents a stereo thermal depth perception dataset for autonomous aerial perception applications. The dataset consists of stereo thermal images, LiDAR, IMU, and ground-truth depth maps captured in urban and forest settings under diverse conditions such as day, night, rain, and smoke. We benchmark representative stereo depth estimation algorithms, offering insights into their performance in degraded conditions. Models trained on our dataset generalize well to unseen smoky conditions, highlighting the robustness of stereo thermal imaging for depth perception. We aim for this work to enhance robotic perception in disaster scenarios, allowing for exploration and operations in previously unreachable areas. The dataset and source code are available at https://firestereo.github.io.
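To make the stereo-depth pipeline the abstract refers to concrete, below is a minimal sketch of classical stereo depth estimation on a rectified thermal image pair using OpenCV's semi-global block matching. This is not the paper's benchmarked method or the dataset's loader; the file names, calibration values (focal length, baseline), and SGBM parameters are hypothetical placeholders.

```python
# Minimal sketch: disparity-to-depth on a rectified stereo thermal pair.
# All paths and calibration constants below are hypothetical, not taken
# from the FIReStereo release.
import cv2
import numpy as np

# Load a rectified left/right thermal pair (assumed 8-bit grayscale).
left = cv2.imread("left_thermal.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_thermal.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching; OpenCV returns fixed-point disparities
# scaled by 16.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,   # must be divisible by 16
    blockSize=5,
    P1=8 * 5 * 5,         # smoothness penalties (typical SGBM heuristics)
    P2=32 * 5 * 5,
    uniquenessRatio=10,
)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Convert disparity d (px) to metric depth: depth = f * B / d,
# where f is the focal length in pixels and B the stereo baseline in meters.
focal_px = 400.0      # hypothetical focal length
baseline_m = 0.24     # hypothetical baseline
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_px * baseline_m / disparity[valid]
```

The same depth = f * B / d relation underlies both classical matchers like the one above and learned stereo networks; the released ground-truth depth maps allow either family to be evaluated on the dataset.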