RSANet: Relative-sequence quality assessment network for gait recognition in the wild

Guozhen Peng, Yunhong Wang, Shaoxiong Zhang, Rui Li, Yuwei Zhao, Annan Li

Pattern Recognition, Volume 161, Article 111219. Published 2024-11-28. DOI: 10.1016/j.patcog.2024.111219
Cited by: 0
Abstract
Gait recognition in the wild has received increasing attention, since gait patterns are hard to disguise and can be captured at a distance. However, due to occlusions and segmentation errors, low-quality silhouettes are common and unavoidable. To mitigate this problem, prior works have proposed absolute, single-frame quality assessment models. Although these methods perform well, they consider only the silhouette quality of individual frames, without accounting for how quality varies across the entire sequence. In this paper, we propose a Relative-Sequence Quality Assessment Network, named RSANet. It uses the Average Feature Similarity Module (AFSM) to evaluate silhouette quality by computing the similarity between one silhouette and all other silhouettes in the same sequence; quality is thus defined relative to the sequence itself. Furthermore, RSANet uses Multi-Temporal-Receptive-Field Residual Blocks (MTB) to extend temporal receptive fields without increasing the parameter count. RSANet achieves Rank-1 accuracies of 75.2%, 81.8%, and 77.6% on the Gait3D, GREW, and BUAA-Duke-Gait datasets, respectively. The code is available at https://github.com/PGZ-Sleepy/RSANet.
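The AFSM idea lends itself to a compact illustration. The sketch below is a hypothetical reading of the abstract, not the authors' implementation (which is in the linked repository): per-frame features are compared by cosine similarity, and each frame's quality score is its average similarity to every other frame in the same sequence, so occluded or badly segmented frames score low relative to the rest. The function name, tensor shapes, and the softmax weighting step are all assumptions.

```python
import torch
import torch.nn.functional as F

def average_feature_similarity(features: torch.Tensor) -> torch.Tensor:
    """Relative per-frame quality scores for one silhouette sequence.

    features: [T, C] frame-level feature vectors (T frames, C channels).
    Returns:  [T] scores; frames similar to the rest of the sequence
              score high, outliers (occluded / badly segmented) score low.
    """
    f = F.normalize(features, dim=1)   # unit-normalize each frame's feature
    sim = f @ f.t()                    # [T, T] cosine similarity matrix
    t = sim.size(0)
    # Average each frame's similarity to every *other* frame (exclude self).
    return (sim.sum(dim=1) - sim.diagonal()) / (t - 1)

# Example use: down-weight low-quality frames before temporal pooling.
feats = torch.randn(30, 256)           # e.g. 30 frames, 256-d features (made up)
weights = torch.softmax(average_feature_similarity(feats), dim=0)
pooled = (weights.unsqueeze(1) * feats).sum(dim=0)  # quality-weighted sequence feature
```

Because the score is computed against the frame's own sequence rather than against a fixed threshold, the same silhouette can be rated differently in different sequences, which is what makes the assessment "relative" rather than "absolute".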
Journal overview
The field of pattern recognition is both mature and rapidly evolving, playing a crucial role in related areas such as computer vision, image processing, text analysis, and neural networks. It intersects closely with machine learning and is being applied in emerging areas such as biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago in the early days of computer science, has since grown significantly in scope and influence.