{"title":"通过适应性实例关联蒸馏实现低分辨率人脸识别","authors":"Ruixin Shi, Weijia Guo, Shiming Ge","doi":"arxiv-2409.02049","DOIUrl":null,"url":null,"abstract":"Low-resolution face recognition is a challenging task due to the missing of\ninformative details. Recent approaches based on knowledge distillation have\nproven that high-resolution clues can well guide low-resolution face\nrecognition via proper knowledge transfer. However, due to the distribution\ndifference between training and testing faces, the learned models often suffer\nfrom poor adaptability. To address that, we split the knowledge transfer\nprocess into distillation and adaptation steps, and propose an adaptable\ninstance-relation distillation approach to facilitate low-resolution face\nrecognition. In the approach, the student distills knowledge from\nhigh-resolution teacher in both instance level and relation level, providing\nsufficient cross-resolution knowledge transfer. Then, the learned student can\nbe adaptable to recognize low-resolution faces with adaptive batch\nnormalization in inference. In this manner, the capability of recovering\nmissing details of familiar low-resolution faces can be effectively enhanced,\nleading to a better knowledge transfer. Extensive experiments on low-resolution\nface recognition clearly demonstrate the effectiveness and adaptability of our\napproach.","PeriodicalId":501480,"journal":{"name":"arXiv - CS - Multimedia","volume":"6 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation\",\"authors\":\"Ruixin Shi, Weijia Guo, Shiming Ge\",\"doi\":\"arxiv-2409.02049\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Low-resolution face recognition is a challenging task due to the missing of\\ninformative details. 
Recent approaches based on knowledge distillation have\\nproven that high-resolution clues can well guide low-resolution face\\nrecognition via proper knowledge transfer. However, due to the distribution\\ndifference between training and testing faces, the learned models often suffer\\nfrom poor adaptability. To address that, we split the knowledge transfer\\nprocess into distillation and adaptation steps, and propose an adaptable\\ninstance-relation distillation approach to facilitate low-resolution face\\nrecognition. In the approach, the student distills knowledge from\\nhigh-resolution teacher in both instance level and relation level, providing\\nsufficient cross-resolution knowledge transfer. Then, the learned student can\\nbe adaptable to recognize low-resolution faces with adaptive batch\\nnormalization in inference. In this manner, the capability of recovering\\nmissing details of familiar low-resolution faces can be effectively enhanced,\\nleading to a better knowledge transfer. 
Extensive experiments on low-resolution\\nface recognition clearly demonstrate the effectiveness and adaptability of our\\napproach.\",\"PeriodicalId\":501480,\"journal\":{\"name\":\"arXiv - CS - Multimedia\",\"volume\":\"6 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Multimedia\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.02049\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Multimedia","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.02049","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Low-Resolution Face Recognition via Adaptable Instance-Relation Distillation
Low-resolution face recognition is a challenging task due to the loss of informative details. Recent approaches based on knowledge distillation have proven that high-resolution clues can effectively guide low-resolution face recognition via proper knowledge transfer. However, due to the distribution difference between training and testing faces, the learned models often suffer from poor adaptability. To address this, we split the knowledge transfer process into distillation and adaptation steps, and propose an adaptable instance-relation distillation approach to facilitate low-resolution face recognition. In this approach, the student distills knowledge from a high-resolution teacher at both the instance level and the relation level, providing sufficient cross-resolution knowledge transfer. The learned student can then adapt to recognize low-resolution faces via adaptive batch normalization at inference. In this manner, the capability to recover the missing details of familiar low-resolution faces is effectively enhanced, leading to better knowledge transfer. Extensive experiments on low-resolution face recognition clearly demonstrate the effectiveness and adaptability of our approach.
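The two distillation levels described above can be sketched as loss terms over batch embeddings: an instance-level loss that matches each student embedding to its teacher counterpart, and a relation-level loss that matches the pairwise-similarity structure within the batch. The sketch below is a minimal NumPy illustration of this general scheme, not the paper's implementation; the function names, the use of MSE, and the `alpha`/`beta` weights are assumptions. The final function illustrates the adaptation step in spirit: adaptive batch normalization replaces source-domain statistics with statistics estimated from target (low-resolution) data at inference.

```python
import numpy as np

def instance_loss(student_emb, teacher_emb):
    # Instance level: match each student embedding to its teacher
    # counterpart (MSE is an assumed choice of distance).
    return float(np.mean((student_emb - teacher_emb) ** 2))

def relation_matrix(emb):
    # Pairwise cosine-similarity matrix within a batch.
    normed = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    return normed @ normed.T

def relation_loss(student_emb, teacher_emb):
    # Relation level: match the structure of pairwise similarities
    # between samples, not the embeddings themselves.
    return float(np.mean((relation_matrix(student_emb)
                          - relation_matrix(teacher_emb)) ** 2))

def distillation_loss(student_emb, teacher_emb, alpha=1.0, beta=1.0):
    # Combined objective; the weights alpha/beta are illustrative.
    return (alpha * instance_loss(student_emb, teacher_emb)
            + beta * relation_loss(student_emb, teacher_emb))

def adaptive_bn_stats(target_features):
    # Adaptation step sketch: re-estimate batch-norm mean/variance
    # from target-domain (low-resolution) features at inference.
    return target_features.mean(axis=0), target_features.var(axis=0)
```

When student and teacher embeddings coincide, both losses vanish, which is the sanity check one would expect of any distillation objective of this form.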