{"title":"FACT: Feature Adaptive Continual-learning Tracker for Multiple Object Tracking","authors":"Rongzihan Song, Zhenyu Weng, Huiping Zhuang, Jinchang Ren, Yongming Chen, Zhiping Lin","doi":"arxiv-2409.07904","DOIUrl":null,"url":null,"abstract":"Multiple object tracking (MOT) involves identifying multiple targets and\nassigning them corresponding IDs within a video sequence, where occlusions are\noften encountered. Recent methods address occlusions using appearance cues\nthrough online learning techniques to improve adaptivity or offline learning\ntechniques to utilize temporal information from videos. However, most existing\nonline learning-based MOT methods are unable to learn from all past tracking\ninformation to improve adaptivity on long-term occlusions while maintaining\nreal-time tracking speed. On the other hand, temporal information-based offline\nlearning methods maintain a long-term memory to store past tracking\ninformation, but this approach restricts them to use only local past\ninformation during tracking. To address these challenges, we propose a new MOT\nframework called the Feature Adaptive Continual-learning Tracker (FACT), which\nenables real-time tracking and feature learning for targets by utilizing all\npast tracking information. We demonstrate that the framework can be integrated\nwith various state-of-the-art feature-based trackers, thereby improving their\ntracking ability. Specifically, we develop the feature adaptive\ncontinual-learning (FAC) module, a neural network that can be trained online to\nlearn features adaptively using all past tracking information during tracking.\nMoreover, we also introduce a two-stage association module specifically\ndesigned for the proposed continual learning-based tracking. Extensive\nexperiment results demonstrate that the proposed method achieves\nstate-of-the-art online tracking performance on MOT17 and MOT20 benchmarks. The\ncode will be released upon acceptance.","PeriodicalId":501130,"journal":{"name":"arXiv - CS - Computer Vision and Pattern Recognition","volume":"87 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Computer Vision and Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.07904","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Multiple object tracking (MOT) involves identifying multiple targets and
assigning them corresponding IDs within a video sequence, where occlusions are
often encountered. Recent methods address occlusions with appearance cues,
either through online learning techniques to improve adaptivity or through
offline learning techniques that exploit temporal information from videos. However, most existing
online learning-based MOT methods are unable to learn from all past tracking
information to improve adaptivity to long-term occlusions while maintaining
real-time tracking speed. On the other hand, temporal information-based offline
learning methods maintain a long-term memory to store past tracking
information, but this approach restricts them to using only local past
information during tracking. To address these challenges, we propose a new MOT
framework called the Feature Adaptive Continual-learning Tracker (FACT), which
enables real-time tracking and feature learning for targets by utilizing all
past tracking information. We demonstrate that the framework can be integrated
with various state-of-the-art feature-based trackers, thereby improving their
tracking ability. Specifically, we develop the feature adaptive
continual-learning (FAC) module, a neural network that can be trained online to
learn features adaptively using all past tracking information during tracking.
Moreover, we introduce a two-stage association module specifically
designed for the proposed continual-learning-based tracking. Extensive
experimental results demonstrate that the proposed method achieves
state-of-the-art online tracking performance on the MOT17 and MOT20 benchmarks. The
code will be released upon acceptance.
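
The abstract describes two components at a high level: a per-target appearance model that is trained online from all past tracking information (the FAC module) and a two-stage association step. No implementation details are given here, so the following is a minimal, hypothetical sketch of how such pieces could be wired into a tracking loop. The running-mean appearance statistic merely stands in for an online-learned feature model, and Hungarian matching on IoU followed by appearance stands in for the two-stage association; all names, thresholds, and design choices below are assumptions, not the authors' code.

```python
# Hedged sketch, not the FACT implementation: an online-updated per-track
# appearance statistic plus a two-stage (spatial, then appearance) association.
import numpy as np
from scipy.optimize import linear_sum_assignment


def iou(a: np.ndarray, b: np.ndarray) -> float:
    """IoU of two boxes in (x1, y1, x2, y2) format."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)


class Track:
    """One target whose appearance statistic is updated from every past frame."""

    def __init__(self, track_id: int, box: np.ndarray, feat: np.ndarray):
        self.id = track_id
        self.box = box.astype(float)
        self.n_obs = 1
        self.mean_feat = feat / (np.linalg.norm(feat) + 1e-9)

    def update(self, box: np.ndarray, feat: np.ndarray) -> None:
        # Running mean over all past features: constant time and memory, yet
        # every past observation contributes (a stand-in for an online-trained
        # appearance model such as the FAC module).
        self.box = box.astype(float)
        feat = feat / (np.linalg.norm(feat) + 1e-9)
        self.n_obs += 1
        self.mean_feat += (feat - self.mean_feat) / self.n_obs


def two_stage_associate(tracks, boxes, feats, iou_thr=0.3, app_thr=0.4):
    """Stage 1: spatial (IoU) matching. Stage 2: appearance matching on leftovers."""
    unmatched_t = list(range(len(tracks)))
    unmatched_d = list(range(len(boxes)))
    matches = []

    # Stage 1: IoU-based assignment for detections that still overlap their tracks.
    if unmatched_t and unmatched_d:
        cost = np.array([[1.0 - iou(tracks[t].box, boxes[d]) for d in unmatched_d]
                         for t in unmatched_t])
        rows, cols = linear_sum_assignment(cost)
        for r, c in zip(rows, cols):
            if cost[r, c] <= 1.0 - iou_thr:
                matches.append((unmatched_t[r], unmatched_d[c]))
        matched_t = {m[0] for m in matches}
        matched_d = {m[1] for m in matches}
        unmatched_t = [t for t in unmatched_t if t not in matched_t]
        unmatched_d = [d for d in unmatched_d if d not in matched_d]

    # Stage 2: appearance-based assignment, useful when occlusion broke spatial overlap.
    if unmatched_t and unmatched_d:
        det_feats = np.stack([feats[d] / (np.linalg.norm(feats[d]) + 1e-9)
                              for d in unmatched_d])
        trk_feats = np.stack([tracks[t].mean_feat for t in unmatched_t])
        cost = 1.0 - trk_feats @ det_feats.T  # cosine-style distance to running mean
        rows, cols = linear_sum_assignment(cost)
        for r, c in zip(rows, cols):
            if cost[r, c] <= 1.0 - app_thr:
                matches.append((unmatched_t[r], unmatched_d[c]))

    return matches
```

In the paper, the per-track appearance model is a trainable neural module rather than a running mean, and the association stages are designed around that continual learner; the sketch above only conveys the overall control flow of such a tracker.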