{"title":"Bias to Balance: New-Knowledge-Preferred Few-Shot Class-Incremental Learning via Transition Calibration","authors":"Hongquan Zhang;Zhizhong Zhang;Xin Tan;Yanyun Qu;Yuan Xie","doi":"10.1109/TNNLS.2025.3550429","DOIUrl":null,"url":null,"abstract":"Humans can quickly learn new concepts with limited experience, while not forgetting learned knowledge. Such ability in machine learning is referred to as few-shot class-incremental learning (FSCIL). Although some methods try to solve this problem by putting similar efforts to prevent forgetting and promote learning, we find existing techniques do not give enough importance to the new category as new training samples are rather rare. In this article, we propose a new biased-to-unbiased rectification method, which introduces a trainable transition matrix to mitigate the prediction discrepancy between the old classes and the new classes. This transition matrix is to be diagonally dominated, normalized, and differentiable with new-knowledge-preferred prior, to solving the strong bias between heavy old knowledge and limited new knowledge. Hence, we can achieve a balanced solution between learning new concepts and preventing catastrophic forgetting by giving new classes more chances. Extensive experiments on miniImagenet, CIFAR100, and CUB200 demonstrate that our method outperforms the latest state-of-the-art methods by 1.1%, 1.44%, and 2.08%, respectively.","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"36 8","pages":"15347-15358"},"PeriodicalIF":8.9000,"publicationDate":"2025-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10970073/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Humans can quickly learn new concepts from limited experience without forgetting previously learned knowledge. The machine learning counterpart of this ability is referred to as few-shot class-incremental learning (FSCIL). Although some methods attempt to solve this problem by devoting similar effort to preventing forgetting and to promoting learning, we find that existing techniques do not give enough importance to the new categories, whose training samples are rather scarce. In this article, we propose a new biased-to-unbiased rectification method that introduces a trainable transition matrix to mitigate the prediction discrepancy between the old classes and the new classes. This transition matrix is designed to be diagonally dominant, normalized, and differentiable, with a new-knowledge-preferred prior, so as to resolve the strong bias between abundant old knowledge and limited new knowledge. Hence, by giving new classes more chances, we achieve a balanced solution between learning new concepts and preventing catastrophic forgetting. Extensive experiments on miniImageNet, CIFAR100, and CUB200 demonstrate that our method outperforms the latest state-of-the-art methods by 1.1%, 1.44%, and 2.08%, respectively.
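The abstract's key ingredients for the transition matrix (trainable, diagonally dominant, row-normalized, differentiable, carrying a new-knowledge-preferred prior) can be sketched concretely. Below is a minimal, hypothetical PyTorch sketch of one way such a calibrator could be built under those constraints; the class name TransitionCalibrator and the parameters diag_boost and new_class_prior are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of biased-to-unbiased transition calibration,
# based only on the properties stated in the abstract; not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TransitionCalibrator(nn.Module):
    """Learnable row-stochastic transition matrix T that redistributes
    probability mass from biased predictions toward new classes."""

    def __init__(self, num_old: int, num_new: int,
                 diag_boost: float = 3.0, new_class_prior: float = 1.0):
        super().__init__()
        n = num_old + num_new
        # Large diagonal logits make each row's softmax peak on the
        # diagonal, giving a diagonally dominant, identity-like T.
        init = diag_boost * torch.eye(n)
        # Extra logit mass on the new-class columns encodes one plausible
        # form of the "new-knowledge-preferred" prior (an assumption here).
        init[:, num_old:] += new_class_prior
        self.logits = nn.Parameter(init)  # trainable, hence differentiable

    def forward(self, class_logits: torch.Tensor) -> torch.Tensor:
        # Row-wise softmax normalizes T: every row sums to 1.
        T = F.softmax(self.logits, dim=1)
        probs = F.softmax(class_logits, dim=1)
        # Calibrated prediction: p_unbiased = p_biased @ T, so each output
        # row is again a valid probability distribution.
        return probs @ T


# Usage: rebalance a biased 64-old / 5-new classifier's outputs.
calib = TransitionCalibrator(num_old=64, num_new=5)
biased_logits = torch.randn(8, 69)   # batch of raw classifier logits
balanced = calib(biased_logits)      # shape (8, 69), rows sum to 1
```

Because T is a parameter of the network, it can be optimized jointly with the classifier during the incremental sessions, and the diagonal-dominant initialization keeps the calibration close to the identity until training shifts mass toward the under-represented new classes.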
Journal Introduction:
IEEE Transactions on Neural Networks and Learning Systems presents scholarly articles on the theory, design, and applications of neural networks and other learning systems, with an emphasis on technical and scientific research in this domain.