MARS-LVIG dataset: A multi-sensor aerial robots SLAM dataset for LiDAR-visual-inertial-GNSS fusion

Haotian Li, Yuying Zou, Nan Chen, Jiarong Lin, Xiyuan Liu, Wei Xu, Chunran Zheng, Rundong Li, Dongjiao He, Fanze Kong, Yixi Cai, Zheng Liu, Shunbo Zhou, Kaiwen Xue, Fu Zhang
{"title":"MARS-LVIG dataset: A multi-sensor aerial robots SLAM dataset for LiDAR-visual-inertial-GNSS fusion","authors":"Haotian Li, Yuying Zou, Nan Chen, Jiarong Lin, Xiyuan Liu, Wei Xu, Chunran Zheng, Rundong Li, Dongjiao He, Fanze Kong, Yixi Cai, Zheng Liu, Shunbo Zhou, Kaiwen Xue, Fu Zhang","doi":"10.1177/02783649241227968","DOIUrl":null,"url":null,"abstract":"In recent years, advancements in Light Detection and Ranging (LiDAR) technology have made 3D LiDAR sensors more compact, lightweight, and affordable. This progress has spurred interest in integrating LiDAR with sensors such as Inertial Measurement Units (IMUs) and cameras for Simultaneous Localization and Mapping (SLAM) research. Public datasets covering different scenarios, platforms, and viewpoints are crucial for multi-sensor fusion SLAM studies, yet most focus on handheld or vehicle-mounted devices with front or 360-degree views. Data from aerial vehicles with downward-looking views is scarce, existing relevant datasets usually feature low altitudes and are mostly limited to small campus environments. To fill this gap, we introduce the Multi-sensor Aerial Robots SLAM dataset (MARS-LVIG dataset), providing unique aerial downward-looking LiDAR-Visual-Inertial-GNSS data with viewpoints from altitudes between 80 m and 130 m. The dataset not only offers new aspects to test and evaluate existing SLAM algorithms, but also brings new challenges which can facilitate researches and developments of more advanced SLAM algorithms. The MARS-LVIG dataset contains 21 sequences, acquired across diversified large-area environments including an aero-model airfield, an island, a rural town, and a valley. Within these sequences, the UAV has speeds varying from 3 m/s to 12 m/s, a scanning area reaching up to 577,000 m2, and the max path length of 7.148 km in a single flight. 
This dataset encapsulates data collected by a lightweight, hardware-synchronized sensor package that includes a solid-state 3D LiDAR, a global-shutter RGB camera, IMUs, and a raw message receiver of the Global Navigation Satellite System (GNSS). For algorithm evaluation, this dataset releases ground truth of both localization and mapping, which are acquired by on-board Real-time Kinematic (RTK) and DJI L1 (post-processed by its supporting software DJI Terra), respectively. The dataset can be downloaded from: https://mars.hku.hk/dataset.html .","PeriodicalId":501362,"journal":{"name":"The International Journal of Robotics Research","volume":"1 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The International Journal of Robotics Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1177/02783649241227968","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In recent years, advancements in Light Detection and Ranging (LiDAR) technology have made 3D LiDAR sensors more compact, lightweight, and affordable. This progress has spurred interest in integrating LiDAR with sensors such as Inertial Measurement Units (IMUs) and cameras for Simultaneous Localization and Mapping (SLAM) research. Public datasets covering different scenarios, platforms, and viewpoints are crucial for multi-sensor fusion SLAM studies, yet most focus on handheld or vehicle-mounted devices with front or 360-degree views. Data from aerial vehicles with downward-looking views is scarce; the existing relevant datasets usually feature low altitudes and are mostly limited to small campus environments. To fill this gap, we introduce the Multi-sensor Aerial Robots SLAM dataset (MARS-LVIG dataset), providing unique aerial downward-looking LiDAR-visual-inertial-GNSS data with viewpoints from altitudes between 80 m and 130 m. The dataset not only offers new aspects for testing and evaluating existing SLAM algorithms, but also brings new challenges that can facilitate the research and development of more advanced SLAM algorithms. The MARS-LVIG dataset contains 21 sequences, acquired across diversified large-area environments including an aero-model airfield, an island, a rural town, and a valley. Within these sequences, the UAV flies at speeds varying from 3 m/s to 12 m/s, covers a scanning area of up to 577,000 m², and reaches a maximum path length of 7.148 km in a single flight. The dataset encapsulates data collected by a lightweight, hardware-synchronized sensor package that includes a solid-state 3D LiDAR, a global-shutter RGB camera, IMUs, and a raw-message receiver for the Global Navigation Satellite System (GNSS). For algorithm evaluation, the dataset releases ground truth for both localization and mapping, acquired by the on-board Real-Time Kinematic (RTK) receiver and a DJI L1 (post-processed by its supporting software, DJI Terra), respectively.

The dataset can be downloaded from: https://mars.hku.hk/dataset.html
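The localization ground truth from the on-board RTK receiver enables standard trajectory-error evaluation of SLAM output. As a minimal sketch (not part of the dataset's own tooling; the synthetic trajectories below are placeholders for an estimated trajectory and the RTK ground truth), the absolute trajectory error (ATE) can be computed after a rigid SVD-based alignment of the two position sequences:

```python
import numpy as np

def align_rigid(est, gt):
    """Rigidly align est (N x 3) to gt (N x 3): rotation + translation via SVD."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    E, G = est - mu_e, gt - mu_g
    U, _, Vt = np.linalg.svd(E.T @ G)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # guard against reflections
    R = Vt.T @ S @ U.T
    t = mu_g - R @ mu_e
    return est @ R.T + t

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error after rigid alignment."""
    aligned = align_rigid(est, gt)
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))

# Synthetic check: a rotated and translated copy of the ground truth
# should align back exactly, giving (near-)zero ATE.
rng = np.random.default_rng(0)
gt = rng.normal(size=(100, 3))
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
est = gt @ Rz.T + np.array([1.0, -2.0, 0.5])
print(round(ate_rmse(est, gt), 6))
```

In practice, the estimated trajectory would first be time-associated with the RTK poses (e.g., by nearest timestamp) before alignment; that association step is omitted here for brevity.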