Multi-unmanned Aerial Vehicle Pose Estimation Based on Visual-inertial-range Sensor Fusion

Junho Choi, Christiansen Marsim Kevin, Myeongwoo Jeong, Kihwan Ryoo, Jeewon Kim, Hyun Myung
{"title":"基于视觉-惯性距离传感器融合的多无人机姿态估计","authors":"Junho Choi, Christiansen Marsim Kevin, Myeongwoo Jeong, Kihwan Ryoo, Jeewon Kim, Hyun Myung","doi":"10.5302/j.icros.2023.23.0135","DOIUrl":null,"url":null,"abstract":"Multi-robot state estimation is crucial for real-time and accurate operation, especially in complex environments where a global navigation satellite system cannot be used. Many researchers employ multiple sensor modalities, including cameras, LiDAR, and ultra-wideband (UWB), to achieve real-time state estimation. However, each sensor has specific requirements that might limit its usage. While LiDAR sensors demand a high payload capacity, camera sensors must have matching image features between robots, and UWB sensors require known fixed anchor locations for accurate positioning. This study introduces a robust localization system with a minimal sensor setup that eliminates the need for the previously mentioned requirements. We used an anchor-free UWB setup to establish a global coordinate system, unifying all robots. Each robot performs visual-inertial odometry to estimate its ego-motion in its local coordinate system. By optimizing the local odometry from each robot using inter-robot range measurements, the positions of the robots can be robustly estimated without relying on an extensive sensor setup or infrastructure. Our method offers a simple yet effective solution for achieving accurate and real-time multi-robot state estimation in challenging environments without relying on traditional sensor requirements.","PeriodicalId":38644,"journal":{"name":"Journal of Institute of Control, Robotics and Systems","volume":"36 2","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-11-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multi-unmanned Aerial Vehicle Pose Estimation Based on Visual-inertial-range Sensor Fusion\",\"authors\":\"Junho Choi, Christiansen Marsim Kevin, Myeongwoo Jeong, Kihwan Ryoo, Jeewon Kim, Hyun Myung\",\"doi\":\"10.5302/j.icros.2023.23.0135\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Multi-robot state estimation is crucial for real-time and accurate operation, especially in complex environments where a global navigation satellite system cannot be used. Many researchers employ multiple sensor modalities, including cameras, LiDAR, and ultra-wideband (UWB), to achieve real-time state estimation. However, each sensor has specific requirements that might limit its usage. While LiDAR sensors demand a high payload capacity, camera sensors must have matching image features between robots, and UWB sensors require known fixed anchor locations for accurate positioning. This study introduces a robust localization system with a minimal sensor setup that eliminates the need for the previously mentioned requirements. We used an anchor-free UWB setup to establish a global coordinate system, unifying all robots. Each robot performs visual-inertial odometry to estimate its ego-motion in its local coordinate system. By optimizing the local odometry from each robot using inter-robot range measurements, the positions of the robots can be robustly estimated without relying on an extensive sensor setup or infrastructure. 
Our method offers a simple yet effective solution for achieving accurate and real-time multi-robot state estimation in challenging environments without relying on traditional sensor requirements.\",\"PeriodicalId\":38644,\"journal\":{\"name\":\"Journal of Institute of Control, Robotics and Systems\",\"volume\":\"36 2\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-11-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Institute of Control, Robotics and Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.5302/j.icros.2023.23.0135\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"Mathematics\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Institute of Control, Robotics and Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5302/j.icros.2023.23.0135","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Mathematics","Score":null,"Total":0}

Abstract

Multi-robot state estimation is crucial for real-time and accurate operation, especially in complex environments where a global navigation satellite system cannot be used. Many researchers employ multiple sensor modalities, including cameras, LiDAR, and ultra-wideband (UWB), to achieve real-time state estimation. However, each sensor has specific requirements that might limit its usage. While LiDAR sensors demand a high payload capacity, camera sensors must have matching image features between robots, and UWB sensors require known fixed anchor locations for accurate positioning. This study introduces a robust localization system with a minimal sensor setup that eliminates the need for the previously mentioned requirements. We used an anchor-free UWB setup to establish a global coordinate system, unifying all robots. Each robot performs visual-inertial odometry to estimate its ego-motion in its local coordinate system. By optimizing the local odometry from each robot using inter-robot range measurements, the positions of the robots can be robustly estimated without relying on an extensive sensor setup or infrastructure. Our method offers a simple yet effective solution for achieving accurate and real-time multi-robot state estimation in challenging environments without relying on traditional sensor requirements.
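To make the frame-alignment idea concrete, the following is a minimal 2-D sketch, not the authors' implementation: robot A's local VIO frame is taken as the shared global frame, and the unknown rigid transform (translation plus rotation) of robot B's VIO frame is recovered by minimizing the mismatch between measured inter-robot UWB ranges and the ranges implied by the current alignment. All function names and the synthetic data below are illustrative assumptions.

```python
# Minimal 2-D sketch: align robot B's local VIO frame to robot A's frame
# using only noisy inter-robot range measurements (no fixed UWB anchors).
# Names and data are illustrative, not taken from the paper.
import numpy as np
from scipy.optimize import least_squares

def to_global(params, p_local):
    """Apply a 2-D rigid transform (tx, ty, theta) to local VIO positions."""
    tx, ty, th = params
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    return p_local @ R.T + np.array([tx, ty])

def range_residuals(x, vio_a, vio_b, ranges):
    """Measured inter-robot ranges minus ranges implied by the current
    alignment. Robot A's frame is fixed as the global frame; x = (tx, ty,
    theta) maps robot B's local frame into it."""
    pa = vio_a                   # robot A already expressed in the global frame
    pb = to_global(x, vio_b)     # robot B transformed by the current estimate
    return np.linalg.norm(pa - pb, axis=1) - ranges

# Synthetic trajectories: B's frame is offset and rotated relative to A's.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 50)
vio_a = np.c_[np.cos(t), np.sin(t)]                    # A's VIO trajectory
true = (1.5, -0.8, 0.4)                                # ground-truth transform
vio_b_global = np.c_[2 + np.cos(2 * t), np.sin(2 * t)] # B's true global path
R = np.array([[np.cos(true[2]), -np.sin(true[2])],
              [np.sin(true[2]),  np.cos(true[2])]])
vio_b = (vio_b_global - np.array(true[:2])) @ R        # B's path in its own frame
ranges = np.linalg.norm(vio_a - vio_b_global, axis=1) \
         + rng.normal(0, 0.02, t.size)                 # noisy UWB ranges

sol = least_squares(range_residuals, x0=np.zeros(3),
                    args=(vio_a, vio_b, ranges))
print("estimated (tx, ty, theta):", sol.x)             # should approach `true`
```

With enough relative motion between the robots, the range residuals constrain all three transform parameters; degenerate geometries (e.g., both robots stationary) leave the alignment ambiguous, which is why the optimization is run over a window of odometry rather than a single measurement.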