Target–distractor memory joint tracking algorithm via Credit Allocation Network

IF 2.4 · CAS Region 4 (Computer Science) · Q3 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE · Machine Vision and Applications · Pub Date: 2024-02-09 · DOI: 10.1007/s00138-024-01508-4
Huanlong Zhang, Panyun Wang, Zhiwu Chen, Jie Zhang, Linwei Li
Citations: 0

Abstract

Tracking frameworks based on memory networks have gained significant attention due to their enhanced adaptability to variations in target appearance. However, their performance is limited by the negative effects of distractors in the background. Hence, this paper proposes a tracking method that uses a Credit Allocation Network to jointly model target and distractor memory. Specifically, we design a Credit Allocation Network (CAN) that is updated online via a Guided Focus Loss. The CAN produces credit scores for tracking results by learning features of the target object, ensuring that only reliable samples are stored in the memory pool. Furthermore, we construct a multi-domain memory model that simultaneously captures target and background information from multiple historical intervals, which builds a more compatible object appearance model while increasing the diversity of the memory samples. Moreover, a novel target–distractor joint localization strategy is presented, which reads target and distractor information from memory frames via cross-attention, so that erroneous responses in the target response map can be cancelled out using the distractor response map. Experimental results on the OTB-2015, GOT-10k, UAV123, LaSOT, and VOT-2018 datasets show the competitiveness and effectiveness of the proposed method compared to other trackers.
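Two of the ideas in the abstract can be illustrated in a few lines: (1) gating memory-pool updates on a credit score, so only reliable tracking results are stored, and (2) suppressing spurious peaks in the target response map with the distractor response map before localization. The sketch below is a minimal, hypothetical illustration of these two mechanisms, not the paper's implementation; the function names, the fixed threshold, and the subtraction-based fusion are assumptions for illustration only (the paper's CAN is a learned network, and its localization uses cross-attention over memory frames).

```python
import numpy as np

def credit_gated_update(memory_pool, sample, credit,
                        threshold=0.5, capacity=8):
    """Store a tracking result only if its credit score (here a given
    scalar; in the paper, produced by the Credit Allocation Network)
    passes a threshold. Oldest samples are evicted at capacity."""
    if credit >= threshold:
        memory_pool.append(sample)
        if len(memory_pool) > capacity:
            memory_pool.pop(0)  # drop the oldest sample
    return memory_pool

def joint_localization(target_response, distractor_response, weight=1.0):
    """Cancel out erroneous peaks in the target response map using the
    distractor response map, then return the argmax location."""
    fused = target_response - weight * distractor_response
    return np.unravel_index(np.argmax(fused), fused.shape)

# Example: the raw target response peaks on a distractor at (0, 0),
# but after subtracting the distractor response, the true target
# location (2, 2) wins.
target = np.zeros((3, 3)); target[0, 0] = 0.9; target[2, 2] = 0.8
distractor = np.zeros((3, 3)); distractor[0, 0] = 0.7
location = joint_localization(target, distractor)
```

In this toy example the fused map has value 0.2 at the distractor location and 0.8 at the true target, so localization recovers (2, 2).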


Source journal: Machine Vision and Applications (Engineering: Electrical & Electronic)
CiteScore: 6.30 · Self-citation rate: 3.00% · Articles per year: 84 · Review time: 8.7 months
Journal introduction: Machine Vision and Applications publishes high-quality technical contributions in machine vision research and development. Specifically, the editors encourage submissions in all applications and engineering aspects of image-related computing. In particular, original contributions dealing with scientific, commercial, industrial, military, and biomedical applications of machine vision are all within the scope of the journal. Particular emphasis is placed on engineering and technology aspects of image processing and computer vision. The following aspects of machine vision applications are of interest: algorithms, architectures, VLSI implementations, AI techniques and expert systems for machine vision, front-end sensing, multidimensional and multisensor machine vision, real-time techniques, image databases, virtual reality and visualization. Papers must include a significant experimental validation component.