Class Incremental Learning for Visual Task using Knowledge Distillation

Usman Tahir, Amanullah Yasin, Ahmad Jalal
{"title":"Class Incremental Learning for Visual Task using Knowledge Distillation","authors":"Usman Tahir, Amanullah Yasin, Ahmad Jalal","doi":"10.1109/INMIC56986.2022.9972924","DOIUrl":null,"url":null,"abstract":"The Artificial Agent's ability to enhance knowledge incrementally for new data is challenging in class incremental learning because of catastrophic forgetting in which new classes make the trained model quickly forget old classes knowledge. Knowledge distilling techniques and keeping subset of data from the old classes have been proposed to revamp models to accommodate new classes. These techniques allow models to sustain their knowledge without forgetting everything they already know but somewhat alleviate the catastrophic forgetting problem. In this study we propose class incremental learning using bi-distillation (CILBD) method that effectively learn not only the classes of the new data but also previously learned classes. The proposed architecture uses knowledge distillation in such a way that the student model directly learns knowledge from two teacher model and thus alleviate the forgetting of the old class. Our experiments on the iCIFAR-100 dataset showed that the proposed method is more accurate at classifying, forgets less, and works better than state-of-the-art methods.","PeriodicalId":404424,"journal":{"name":"2022 24th International Multitopic Conference (INMIC)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 24th International Multitopic Conference (INMIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INMIC56986.2022.9972924","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

An artificial agent's ability to incrementally extend its knowledge with new data is challenging in class incremental learning because of catastrophic forgetting, in which training on new classes makes the model quickly forget the knowledge of old classes. Knowledge distillation techniques and retaining a subset of data from the old classes have been proposed to adapt models to new classes. These techniques allow models to retain what they already know rather than forgetting it entirely, but they only partially alleviate the catastrophic forgetting problem. In this study we propose a class incremental learning using bi-distillation (CILBD) method that effectively learns not only the classes of the new data but also the previously learned classes. The proposed architecture uses knowledge distillation in such a way that the student model learns directly from two teacher models, thereby alleviating the forgetting of the old classes. Our experiments on the iCIFAR-100 dataset show that the proposed method classifies more accurately, forgets less, and performs better than state-of-the-art methods.
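
The abstract only sketches the bi-distillation idea at a high level, so the following Python (PyTorch) snippet is a minimal illustrative sketch of how a student could distill from two teachers in one class-incremental training step. The function names (distillation_loss, bi_distillation_step), the assumption that the student's output logits are split into an old-class slice and a new-class slice, and the hyperparameters temperature, alpha, and beta are all assumptions made for illustration, not details taken from the paper.

import torch
import torch.nn.functional as F


def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soft-target distillation: KL divergence between temperature-softened
    # distributions, scaled by T^2 as is standard practice.
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature ** 2


def bi_distillation_step(student, teacher_old, teacher_new, images, labels,
                         optimizer, alpha=0.5, beta=0.5):
    # One optimisation step in which the student distills from two frozen
    # teachers: one holding old-class knowledge, one trained on the new classes.
    student.train()
    logits = student(images)

    with torch.no_grad():
        old_logits = teacher_old(images)  # covers previously learned classes
        new_logits = teacher_new(images)  # covers the newly added classes

    n_old = old_logits.shape[1]
    n_new = new_logits.shape[1]

    # Cross-entropy on the ground-truth labels over all classes seen so far.
    ce = F.cross_entropy(logits, labels)
    # Distill old-class behaviour from the old teacher and new-class behaviour
    # from the new teacher, each on its own slice of the student's logits
    # (the slicing is an assumption about the output layout).
    kd_old = distillation_loss(logits[:, :n_old], old_logits)
    kd_new = distillation_loss(logits[:, n_old:n_old + n_new], new_logits)

    loss = ce + alpha * kd_old + beta * kd_new
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In this sketch, distilling the old-class slice from the old teacher is what counters forgetting, while the new teacher and the cross-entropy term drive learning of the newly added classes.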