Few-shot class-incremental learning (FSCIL) confronts the dual challenges of severe overfitting and catastrophic forgetting. Recent prototype-based methods typically obtain prototypes by averaging feature embeddings. However, owing to data scarcity and the heterogeneity of the feature distributions of new classes, these prototypes often deviate from their theoretical optima, which compromises generalization. In this work, we address the FSCIL problem from two aspects. First, we introduce covariance matrices to serve as prototypes, which effectively handle the heterogeneity of feature distributions. By capturing the covariance relationships among high-dimensional features, these prototypes better represent intricate class structures and thus enhance generalization. Second, we propose a novel three-stage FSCIL framework to address the limited-data problem. The framework includes a generator training stage, in which a difference distribution generator is trained on a reference pair set and a generator training set derived from the base training dataset. In the subsequent incremental learning stage, pseudo-samples produced by the generator are combined with real samples to compute the covariance prototypes, and test samples are classified using the Mahalanobis distance. Experiments on CIFAR-100, CUB-200, and miniImageNet show that the proposed method effectively improves the performance of prototype-based approaches.
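To make the covariance-prototype idea concrete, the following is a minimal sketch, not the authors' exact pipeline: it estimates a mean vector and a regularized covariance matrix per class from (possibly pseudo-augmented) feature embeddings and classifies a query by the smallest Mahalanobis distance. The function names and the shrinkage regularizer are illustrative assumptions.

import numpy as np

def compute_prototypes(features_by_class, shrinkage=1e-3):
    """Estimate a (mean, inverse covariance) prototype per class.

    features_by_class: dict mapping class id -> array of shape (n_samples, dim);
    real samples and generator-produced pseudo-samples could be stacked here.
    """
    prototypes = {}
    for cls, feats in features_by_class.items():
        mu = feats.mean(axis=0)
        cov = np.cov(feats, rowvar=False)
        # Shrinkage keeps the covariance invertible when samples are scarce (assumed choice).
        cov += shrinkage * np.eye(feats.shape[1])
        prototypes[cls] = (mu, np.linalg.inv(cov))
    return prototypes

def classify(x, prototypes):
    """Assign x to the class whose prototype yields the smallest Mahalanobis distance."""
    best_cls, best_dist = None, np.inf
    for cls, (mu, cov_inv) in prototypes.items():
        diff = x - mu
        dist = float(diff @ cov_inv @ diff)  # squared Mahalanobis distance
        if dist < best_dist:
            best_cls, best_dist = cls, dist
    return best_cls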
