A lightweight target tracking algorithm based on online correction for meta-learning

IF 2.6 · JCR Q2 (Computer Science, Information Systems) · Journal of Visual Communication and Image Representation, Vol. 103, Article 104228 · Pub Date: 2024-08-01 · DOI: 10.1016/j.jvcir.2024.104228
Yongsheng Qi, Guohua Yin, Yongting Li, Liqiang Liu, Zhengting Jiang
Abstract

Traditional Siamese-network-based object tracking algorithms suffer from high computational complexity, making them difficult to run on embedded devices. Moreover, their success rates decline significantly on long-term tracking tasks. To address these issues, we propose a lightweight long-term object tracking algorithm based on meta-learning, called Meta-Master-based Ghost Fast Tracking (MGTtracker). The algorithm integrates the Ghost mechanism to create a lightweight backbone network, G-ResNet, which extracts target features accurately while running quickly. We design a tiny adaptive weighted fusion feature pyramid network (TiFPN) to enhance feature-information fusion and mitigate interference from similar objects, and we introduce a lightweight region regression network, the Ghost Decouple Net (GDNet), for target position prediction. Finally, we propose a meta-learning-based online template correction mechanism, Meta-Master, to overcome error accumulation in long-term tracking and the difficulty of reacquiring targets after loss. We evaluate the algorithm on the public datasets OTB100, VOT2020, VOT2018LT, and LaSOT, and deploy it on a Jetson Xavier NX for performance testing. Experimental results demonstrate the effectiveness and superiority of the algorithm: compared with existing classic object trackers, our approach reaches 25 FPS on the NX, and real-time correction enhances its robustness. Although similar in accuracy and EAO, our algorithm outperforms comparable algorithms in speed and effectively addresses large cumulative errors and frequent target loss during tracking. Code is released at https://github.com/ygh96521/MGTtracker.git.
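The abstract does not detail how the Ghost mechanism lightens G-ResNet, but the general Ghost-module idea (from GhostNet) is that only a fraction 1/s of output channels comes from a real convolution, with the rest produced by cheap d×d depthwise "ghost" operations. A minimal sketch of the resulting theoretical FLOPs ratio, assuming the paper's Ghost mechanism follows that standard analysis:

```python
# Hedged sketch: theoretical FLOPs ratio of a Ghost module vs. a plain
# convolution. All parameter names are illustrative, not from the paper.

def ghost_speedup(c, k, d, s):
    """Approximate plain-conv FLOPs / Ghost-module FLOPs per output element.

    c: input channels, k: primary kernel size,
    d: cheap-op (depthwise) kernel size, s: ghost ratio (s >= 1).
    """
    plain = c * k * k                               # ordinary convolution cost
    ghost = (c * k * k) / s + ((s - 1) / s) * d * d  # primary conv + cheap ops
    return plain / ghost

# With c=256, k=3, d=3, s=2 the module is close to s = 2x cheaper,
# which is why a Ghost-based backbone suits embedded devices.
ratio = ghost_speedup(256, 3, 3, 2)
```

For realistic channel counts the ratio approaches s, so the backbone's convolutional cost shrinks roughly by the ghost ratio.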
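TiFPN's exact design is not given in the abstract; adaptive weighted fusion in feature pyramids is, however, commonly implemented as fast normalized fusion, where learnable scalars are passed through ReLU and normalized before weighting each input level. A toy sketch under that assumption (flattened feature maps as plain lists; names are illustrative):

```python
# Hedged sketch: fast normalized adaptive weighted fusion,
# w_i = relu(a_i) / (sum_j relu(a_j) + eps), applied element-wise.

def fused(features, raw_weights, eps=1e-4):
    """Fuse same-shaped feature values with learnable scalar weights."""
    relu = [max(0.0, a) for a in raw_weights]       # keep weights non-negative
    total = sum(relu) + eps                          # eps avoids division by zero
    norm = [r / total for r in relu]                 # weights sum to ~1
    # element-wise weighted sum across the input feature levels
    return [sum(w * f[i] for w, f in zip(norm, features))
            for i in range(len(features[0]))]

f1 = [1.0, 2.0]
f2 = [3.0, 4.0]
out = fused([f1, f2], [1.0, 1.0])   # equal weights -> element-wise average
```

During training the raw weights are learned, letting the network suppress pyramid levels dominated by similar-object interference.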

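Meta-Master is described only at a high level, but meta-learned online correction schemes generally share one inner-loop step: nudge the stored template toward the features of a confidently tracked frame using a (meta-learned) step size. A toy sketch of that generic update on plain feature vectors; `alpha` and all names are illustrative assumptions, not the paper's implementation:

```python
# Hedged toy sketch of a meta-learned online template correction step.

def update_template(template, frame_feat, alpha=0.3):
    """One gradient step on the L2 loss 0.5 * ||template - frame_feat||^2.

    alpha plays the role of a meta-learned inner-loop step size.
    """
    # d/dt 0.5*(t - z)^2 = (t - z); move the template against the gradient
    return [t - alpha * (t - z) for t, z in zip(template, frame_feat)]

tmpl = [1.0, 0.0]
new = update_template(tmpl, [0.0, 1.0], alpha=0.5)
```

Applying such a correction only on high-confidence frames is what limits error accumulation: the template tracks appearance drift without being overwritten by occluded or lost-target frames.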
Source journal
Journal of Visual Communication and Image Representation (Engineering & Technology, Computer Science: Software Engineering)
CiteScore: 5.40
Self-citation rate: 11.50%
Articles per year: 188
Review time: 9.9 months
Journal description: The Journal of Visual Communication and Image Representation publishes papers on state-of-the-art visual communication and image representation, with emphasis on novel technologies and theoretical work in this multidisciplinary area of pure and applied research. The field of visual communication and image representation is considered in its broadest sense and covers both digital and analog aspects as well as processing and communication in biological visual systems.
Latest articles in this journal
- Multi-level similarity transfer and adaptive fusion data augmentation for few-shot object detection
- Color image watermarking using vector SNCM-HMT
- A memory access number constraint-based string prediction technique for high throughput SCC implemented in AVS3
- Faster-slow network fused with enhanced fine-grained features for action recognition
- Lightweight macro-pixel quality enhancement network for light field images compressed by versatile video coding