{"title":"Enhancing the Generalization Performance of Few-Shot Image Classification with Self-Knowledge Distillation","authors":"Liang Li, Weidong Jin, Yingkun Huang, Junxiao Ren","doi":"10.24846/v31i2y202207","DOIUrl":null,"url":null,"abstract":": Though deep learning has succeeded in various fields, its performance on tasks without a large-scale dataset is always unsatisfactory. The meta-learning based few-shot learning has been used to address the limited data situation. Because of its fast adaptation to the new concepts, meta-learning fully utilizes the prior transferrable knowledge to recognize the unseen instances. The general belief is that meta-learning leverages a large quantity of few-shot tasks sampled from the base dataset to quickly adapt the learner to an unseen task. In this paper, the teacher model is distilled to transfer the features using the same architecture. Following the standard-setting in few-shot learning, the proposed model was trained from scratch and the distribution was transferred to a better generalization. Feature similarity matching was proposed to compensate for the inner feature similarities. Besides, the prediction from the teacher model was further corrected in the self-knowledge distillation period. The proposed approach was evaluated on several commonly used benchmarks in few-shot learning and performed best among all prior works.","PeriodicalId":49466,"journal":{"name":"Studies in Informatics and Control","volume":" ","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2022-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Studies in Informatics and Control","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.24846/v31i2y202207","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Abstract
Though deep learning has succeeded in various fields, its performance on tasks without a large-scale dataset is often unsatisfactory. Meta-learning-based few-shot learning has been used to address this limited-data situation: because it adapts quickly to new concepts, meta-learning fully exploits prior transferable knowledge to recognize unseen instances. The general belief is that meta-learning leverages a large number of few-shot tasks sampled from the base dataset to quickly adapt the learner to an unseen task. In this paper, the teacher model is distilled to transfer its features to a student with the same architecture. Following the standard setting in few-shot learning, the proposed model was trained from scratch, and the transferred distribution yielded better generalization. Feature similarity matching is proposed to compensate for the inner feature similarities. In addition, the prediction from the teacher model is further corrected during the self-knowledge distillation phase. The proposed approach was evaluated on several commonly used few-shot learning benchmarks and achieved the best performance among all prior works.
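To make the mechanics concrete, the sketch below shows a minimal self-knowledge-distillation objective of the kind the abstract describes: a supervised cross-entropy term, a soft-target distillation term against a same-architecture teacher, and a feature-similarity matching term. This is an illustrative assumption, not the paper's exact formulation; the loss weights alpha and beta, the temperature, and the cosine-based form of the feature-matching term are all hypothetical choices.

import torch
import torch.nn.functional as F

def self_distillation_loss(student_logits, teacher_logits, labels,
                           student_feats, teacher_feats,
                           temperature=4.0, alpha=0.5, beta=0.1):
    """Hypothetical self-KD loss: hard-label CE + soft-label KD
    + feature similarity matching (assumed cosine-based form)."""
    # Standard supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Soft-target distillation: the student mimics the softened class
    # distribution of the (same-architecture) teacher. Scaling by t^2
    # keeps gradient magnitudes comparable across temperatures.
    t = temperature
    kd = F.kl_div(
        F.log_softmax(student_logits / t, dim=1),
        F.softmax(teacher_logits / t, dim=1),
        reduction="batchmean",
    ) * (t * t)

    # Feature similarity matching (assumed form): align the pairwise
    # cosine-similarity structure of the student's intermediate features
    # with that of the teacher's, so within-batch feature relations match.
    s_norm = F.normalize(student_feats, dim=1)
    t_norm = F.normalize(teacher_feats, dim=1)
    fsm = F.mse_loss(s_norm @ s_norm.T, t_norm @ t_norm.T)

    return ce + alpha * kd + beta * fsm

In a self-distillation setup the "teacher" logits and features would come from an earlier snapshot (or a frozen copy) of the same network, which is what allows the model to be trained from scratch without an external pre-trained teacher.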
Journal Description
The journal Studies in Informatics and Control provides important perspectives on topics relevant to Information Technology, with an emphasis on useful applications in its most important areas.
This journal is aimed at advanced practitioners and researchers in the field of IT and welcomes original contributions from scholars and professionals worldwide.
SIC is published both in print and online by the National Institute for R&D in Informatics, ICI Bucharest. The abstracts, full text, and graphics of all articles in the online version of SIC are identical to those in the print version of the journal.