PlasticNet: A Memory Template Extractor with Plastic Neural Networks for Object Tracking

Yuyu Zhao, Shaowu Yang
{"title":"PlasticNet: A Memory Template Extractor with Plastic Neural Networks for Object Tracking","authors":"Yuyu Zhao, Shaowu Yang","doi":"10.1109/ICCRE51898.2021.9435660","DOIUrl":null,"url":null,"abstract":"Visual object tracking plays an important role in military guidance, human-computer interaction, and robot visual navigation. With the high performance and real-time speed, the Siamese network has become popular in recent years, which localizes the target by comparing the similarity of the appearance template and the candidate boxes in the search region. However, the appearance template is only extracted from the current frame, which results in the missing of the target for constantly changing appearance. In this paper, we propose a template extractor that can capture the latest appearance features from the previously predicted templates, named PlasticNet. We take inspiration from the memory mechanism of neuroscience (synaptic plasticity): the connections between neurons will be enhanced when they are stimulated at the same time. We combined it with recurrent networks to realize the PlasticNet. Our method can easily be integrated into existing siamese trackers. Our proposed model is applied in SiamRPN and improved performance. Extensive experiments on OTB2015, VOT2018, VOT2016 datasets demonstrate that our PlasticNet can effectively adapt to appearance changes.","PeriodicalId":382619,"journal":{"name":"2021 6th International Conference on Control and Robotics Engineering (ICCRE)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 6th International Conference on Control and Robotics Engineering (ICCRE)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCRE51898.2021.9435660","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Visual object tracking plays an important role in military guidance, human-computer interaction, and robot visual navigation. Owing to their high performance and real-time speed, Siamese networks have become popular in recent years; they localize the target by comparing the similarity between an appearance template and candidate boxes in the search region. However, the appearance template is extracted from only the current frame, so the tracker tends to lose targets whose appearance changes constantly. In this paper, we propose a template extractor, named PlasticNet, that captures the latest appearance features from previously predicted templates. We take inspiration from the memory mechanism studied in neuroscience (synaptic plasticity): the connections between neurons are strengthened when they are stimulated at the same time. We combine this mechanism with recurrent networks to realize PlasticNet. Our method can easily be integrated into existing Siamese trackers; applied to SiamRPN, it improves performance. Extensive experiments on the OTB2015, VOT2018, and VOT2016 datasets demonstrate that PlasticNet can effectively adapt to appearance changes.
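The abstract gives no architectural details, so the following is only a minimal sketch of the kind of Hebbian ("fire together, wire together") plastic recurrent cell the description suggests, written in PyTorch. The class name PlasticRecurrentCell, the pooled 256-dimensional template features, and the specific update rule (a differentiable-plasticity formulation in which a decaying Hebbian trace modulates the recurrent weights) are assumptions for illustration, not the authors' implementation.

```python
# A hypothetical sketch of a plastic recurrent template extractor, assuming a
# differentiable-plasticity formulation: slow weights are learned by backprop,
# while a Hebbian trace is updated online from co-activations, echoing the idea
# that connections strengthen when neurons are stimulated at the same time.
import torch
import torch.nn as nn


class PlasticRecurrentCell(nn.Module):
    """Recurrent cell whose recurrent weights are a fixed learned part plus a plastic Hebbian trace."""

    def __init__(self, feat_dim: int, eta: float = 0.1):
        super().__init__()
        self.w_in = nn.Linear(feat_dim, feat_dim)                          # input projection (slow weights)
        self.w_rec = nn.Parameter(0.01 * torch.randn(feat_dim, feat_dim))  # fixed recurrent weights (slow)
        self.alpha = nn.Parameter(torch.full((feat_dim, feat_dim), 0.01))  # per-connection plasticity gain
        self.eta = eta                                                     # Hebbian trace learning rate

    def forward(self, x, h_prev, hebb):
        # Effective recurrent weights: slow (backprop-learned) part + fast (Hebbian) part.
        w_eff = self.w_rec + self.alpha * hebb                 # broadcasts to (B, D, D)
        h = torch.tanh(self.w_in(x) + torch.einsum("bi,bij->bj", h_prev, w_eff))
        # Hebbian update: co-active units strengthen their connection (outer product),
        # with decay so stale appearance information gradually fades.
        hebb = (1.0 - self.eta) * hebb + self.eta * torch.einsum("bi,bj->bij", h_prev, h)
        return h, hebb


# Toy usage: fold per-frame template features through the plastic memory so the
# effective template follows recent appearance. Pooled 256-d feature vectors are
# an assumption to keep the sketch small; real Siamese trackers use spatial feature maps.
cell = PlasticRecurrentCell(feat_dim=256)
h = torch.zeros(1, 256)                            # memory template state
hebb = torch.zeros(1, 256, 256)                    # plastic (fast-weight) trace
frames = [torch.randn(1, 256) for _ in range(5)]   # stand-in for per-frame template features
for z_t in frames:
    h, hebb = cell(z_t, h, hebb)                   # h plays the role of the memory template fed to the matching head
```

In a formulation like this, only the Hebbian trace changes at test time while the slow weights and plasticity gains are trained offline, which would keep per-frame template adaptation cheap and compatible with an existing Siamese matching head such as SiamRPN's; whether the paper uses exactly this scheme is not stated in the abstract.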