FlyTracker: Motion Tracking and Obstacle Detection for Drones Using Event Cameras

Yue Wu, Jingao Xu, Danyang Li, Yadong Xie, Hao Cao, Fan Li, Zheng Yang
{"title":"FlyTracker:使用事件相机的无人机运动跟踪和障碍物检测","authors":"Yue Wu, Jingao Xu, Danyang Li, Yadong Xie, Hao Cao, Fan Li, Zheng Yang","doi":"10.1109/INFOCOM53939.2023.10228976","DOIUrl":null,"url":null,"abstract":"Location awareness in environments is one of the key parts for drones’ applications and have been explored through various visual sensors. However, standard cameras easily suffer from motion blur under high moving speeds and low-quality image under poor illumination, which brings challenges for drones to perform motion tracking. Recently, a kind of bio-inspired sensors called event cameras emerge, offering advantages like high temporal resolution, high dynamic range and low latency, which motivate us to explore their potential to perform motion tracking in limited scenarios. In this paper, we propose FlyTracker, aiming at developing visual sensing ability for drones of both individual and circumambient location-relevant contextual, by using a monocular event camera. In FlyTracker, background-subtraction-based method is proposed to distinguish moving objects from background and fusion-based photometric features are carefully designed to obtain motion information. Through multilevel fusion of events and images, which are heterogeneous visual data, FlyTracker can effectively and reliably track the 6-DoF pose of the drone as well as monitor relative positions of moving obstacles. We evaluate performance of FlyTracker in different environments and the results show that FlyTracker is more accurate than the state-of-the-art baselines.","PeriodicalId":387707,"journal":{"name":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications","volume":"84 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"FlyTracker: Motion Tracking and Obstacle Detection for Drones Using Event Cameras\",\"authors\":\"Yue Wu, Jingao Xu, Danyang Li, Yadong Xie, Hao Cao, Fan Li, Zheng Yang\",\"doi\":\"10.1109/INFOCOM53939.2023.10228976\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Location awareness in environments is one of the key parts for drones’ applications and have been explored through various visual sensors. However, standard cameras easily suffer from motion blur under high moving speeds and low-quality image under poor illumination, which brings challenges for drones to perform motion tracking. Recently, a kind of bio-inspired sensors called event cameras emerge, offering advantages like high temporal resolution, high dynamic range and low latency, which motivate us to explore their potential to perform motion tracking in limited scenarios. In this paper, we propose FlyTracker, aiming at developing visual sensing ability for drones of both individual and circumambient location-relevant contextual, by using a monocular event camera. In FlyTracker, background-subtraction-based method is proposed to distinguish moving objects from background and fusion-based photometric features are carefully designed to obtain motion information. Through multilevel fusion of events and images, which are heterogeneous visual data, FlyTracker can effectively and reliably track the 6-DoF pose of the drone as well as monitor relative positions of moving obstacles. 
We evaluate performance of FlyTracker in different environments and the results show that FlyTracker is more accurate than the state-of-the-art baselines.\",\"PeriodicalId\":387707,\"journal\":{\"name\":\"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications\",\"volume\":\"84 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-05-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/INFOCOM53939.2023.10228976\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INFOCOM53939.2023.10228976","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Location awareness is a key enabler of drone applications and has been explored through a variety of visual sensors. However, standard cameras suffer from motion blur at high moving speeds and from low image quality under poor illumination, which makes motion tracking challenging for drones. Recently, a class of bio-inspired sensors called event cameras has emerged, offering high temporal resolution, high dynamic range, and low latency, which motivates us to explore their potential for motion tracking in such restricted scenarios. In this paper, we propose FlyTracker, which uses a monocular event camera to give a drone visual awareness of both its own pose and the location-relevant context around it. In FlyTracker, a background-subtraction-based method is proposed to distinguish moving objects from the background, and fusion-based photometric features are carefully designed to obtain motion information. Through multilevel fusion of events and images, which are heterogeneous visual data, FlyTracker can effectively and reliably track the 6-DoF pose of the drone as well as monitor the relative positions of moving obstacles. We evaluate FlyTracker in different environments, and the results show that it is more accurate than state-of-the-art baselines.
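The abstract gives only a high-level description of the background-subtraction step. As a purely illustrative sketch, not the paper's implementation, the snippet below shows one common way to separate independently moving objects from the static background in event-camera data: warp events to a reference time using an assumed per-pixel ego-motion flow estimate, accumulate them into a count image, and threshold the residual event density. The function names (`accumulate_events`, `detect_moving_pixels`) and the `flow` input are hypothetical assumptions, not taken from FlyTracker.

```python
import numpy as np

def accumulate_events(xs, ys, width, height):
    """Accumulate per-pixel event counts into a 2-D 'event count image'.

    xs, ys: integer pixel coordinates of events.
    """
    img = np.zeros((height, width), dtype=np.float32)
    np.add.at(img, (ys, xs), 1.0)
    return img

def detect_moving_pixels(xs, ys, ts, flow, width, height, t_ref=None, thresh=3.0):
    """Hypothetical sketch of event-domain background subtraction.

    Events caused by camera ego-motion are warped to a common reference time
    using an assumed per-pixel flow estimate `flow` (H x W x 2, pixels/second).
    After compensation, background events collapse onto sharp edges, while
    events from independently moving objects remain over-dense, so a simple
    threshold on the compensated count image flags candidate moving pixels.

    xs, ys: integer pixel coordinates; ts: event timestamps in seconds.
    """
    if t_ref is None:
        t_ref = ts.max()
    dt = t_ref - ts                          # time to reference for each event
    fx = flow[ys, xs, 0]
    fy = flow[ys, xs, 1]
    xw = np.clip(np.round(xs + fx * dt), 0, width - 1).astype(np.int64)
    yw = np.clip(np.round(ys + fy * dt), 0, height - 1).astype(np.int64)

    compensated = accumulate_events(xw, yw, width, height)
    # Pixels whose compensated event density exceeds the scene average by a
    # margin are not well explained by ego-motion, i.e. likely moving objects.
    mean, std = compensated.mean(), compensated.std() + 1e-6
    return (compensated - mean) / std > thresh
```

In a pipeline like the one the abstract describes, one would expect the ego-motion estimate to come from the tracked 6-DoF pose, with the flagged pixels then clustered into obstacle candidates whose relative positions are monitored over time.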