VSG: Visual Servo Based Geolocalization for Long-Range Target in Outdoor Environment

IEEE Transactions on Intelligent Vehicles · IF 14.0 · CAS Region 1 (Engineering & Technology) · JCR Q1 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE) · Vol. 9, No. 3, pp. 4504-4517 · Pub Date: 2024-03-08 · DOI: 10.1109/TIV.2024.3373696
Yang Liu;Zhihao Sun;Xueyi Wang;Zheng Fan;Xiangyang Wang;Lele Zhang;Hailing Fu;Fang Deng
Citations: 0

Abstract

Long-range target geolocalization in complex outdoor environments has been a long-standing challenge in intelligent transportation and autonomous vehicles, attracting great interest in the fields of vehicle detection, monitoring, and security. Traditional monocular or binocular geolocalization methods are typically implemented through depth estimation or parallax computation; they suffer from large errors when targets are far away and are therefore hard to apply directly in outdoor environments. In this paper, we propose a visual servo-based global geolocalization system, namely VSG, which takes the target position information in the binocular camera images as the control signal and automatically solves for the global position from the gimbal rotation angles. The system solves the problem of long-range static and dynamic target geolocalization (ranging from 220 m to 1200 m), localizing its farthest target at 1223.8 m with only 3.5% localization error. VSG also realizes full-process automation by incorporating deep learning-based object detection, and its localization performance has been verified by a series of experiments. This system is the longest-range global geolocalization method with favorable accuracy reported so far, and it can be deployed across different geomorphologies with great robustness.
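The abstract is terse on mechanics, but the core idea it describes, reading gimbal rotation angles after the servo has centered the target instead of estimating depth from pixel disparity, can be illustrated with elementary triangulation. Below is a minimal planar sketch under assumed geometry: two viewpoints at the ends of a known baseline, each reporting an encoder azimuth angle toward the target. The function name, the two-station layout, and the coordinate convention are illustrative assumptions, not the authors' actual formulation.

```python
import math

def triangulate_from_gimbals(baseline_m: float, az_a: float, az_b: float):
    """Locate a target from two gimbal azimuth readings (planar sketch).

    Viewpoints A and B sit at the ends of a baseline of known length.
    Each gimbal servos until the target is centered in its image, so
    az_a / az_b (radians, measured from the baseline toward the target)
    are read straight off the gimbal encoders rather than estimated
    from pixel disparity.
    """
    # Interior angle at the target; the law of sines then gives the
    # range from A (sin(pi - x) == sin(x)).
    apex = math.pi - az_a - az_b
    if apex <= 0.0:
        raise ValueError("rays do not converge on the target side")
    range_a = baseline_m * math.sin(az_b) / math.sin(apex)
    # Local coordinates: A at the origin, baseline along +x.
    x = range_a * math.cos(az_a)
    y = range_a * math.sin(az_a)
    return x, y, range_a

# Symmetric 60/60 case: the triangle is equilateral, so the range
# from A equals the baseline.
x, y, r = triangulate_from_gimbals(10.0, math.radians(60.0), math.radians(60.0))
```

The sketch also shows why long range is the hard case the paper targets: at 1200 m with a baseline of a few meters the apex angle is tiny, so sin(apex) is near zero and small encoder errors are amplified into large range errors, which is why sub-4% error at 1223.8 m is a strong result.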
Source Journal

IEEE Transactions on Intelligent Vehicles (Mathematics - Control and Optimization)
CiteScore: 12.10
Self-citation rate: 13.40%
Annual article count: 177
Journal Description: The IEEE Transactions on Intelligent Vehicles (T-IV) is a premier platform for publishing peer-reviewed articles that present innovative research concepts, application results, significant theoretical findings, and application case studies in the field of intelligent vehicles. With a particular emphasis on automated vehicles within roadway environments, T-IV aims to raise awareness of pressing research and application challenges. Our focus is on providing critical information to the intelligent vehicle community, serving as a dissemination vehicle for IEEE ITS Society members and others interested in learning about the state-of-the-art developments and progress in research and applications related to intelligent vehicles. Join us in advancing knowledge and innovation in this dynamic field.