Knowledge Distillation by Multiple Student Instance Interaction

Tian Ni, Haoji Hu
{"title":"Knowledge Distillation by Multiple Student Instance Interaction","authors":"Tian Ni, Haoji Hu","doi":"10.1109/prmvia58252.2023.00038","DOIUrl":null,"url":null,"abstract":"Knowledge distillation is an efficient method in neural network compression, which transfers the knowledge from a high-capacity teacher network to a low-capacity student network. Previous approaches follow the ‘one teacher and one student’ paradigm, which neglects the possibility that interaction of multiple students could boost the distillation performance. In this paper, we propose a novel approach by simultaneously training multiple instances of a student model. By adding the similarity and diversity losses into the baseline knowledge distillation and adaptively adjusting the proportion of these losses according to accuracy changes of multiple student instances, we build a distillation system to make students collaborate and compete with each other, which improves system robustness and performance. Experiments show superior performance of the proposed method over existing offline and online distillation schemes on datasets with various scales.","PeriodicalId":221346,"journal":{"name":"2023 International Conference on Pattern Recognition, Machine Vision and Intelligent Algorithms (PRMVIA)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Pattern Recognition, Machine Vision and Intelligent Algorithms (PRMVIA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/prmvia58252.2023.00038","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Knowledge distillation is an effective method for neural network compression that transfers knowledge from a high-capacity teacher network to a low-capacity student network. Previous approaches follow the ‘one teacher and one student’ paradigm, which neglects the possibility that interaction among multiple students could boost distillation performance. In this paper, we propose a novel approach that simultaneously trains multiple instances of a student model. By adding similarity and diversity losses to the baseline knowledge distillation objective and adaptively adjusting the proportion of these losses according to the accuracy changes of the student instances, we build a distillation system in which students collaborate and compete with each other, improving robustness and performance. Experiments show superior performance of the proposed method over existing offline and online distillation schemes on datasets of various scales.
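The abstract describes the training objective only at a high level: each student instance is distilled from the teacher while similarity and diversity losses couple the instances, with the loss weights adapted from accuracy changes. As a rough illustration of how such per-student losses might be combined, the PyTorch-style sketch below pairs a standard distillation loss with pairwise similarity and diversity terms. The specific loss formulations, the weight-adaptation rule, and all function names (`kd_loss`, `similarity_loss`, `diversity_loss`, `total_loss`) are assumptions for illustration, not the paper's actual implementation.

```python
# Illustrative multi-student distillation losses. The exact similarity and
# diversity formulations and the accuracy-driven weight-adaptation rule from
# the paper are not given in the abstract; everything below (function names,
# KL-based similarity, cosine-based diversity, fixed weights) is an assumption.
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, T=4.0):
    """Standard teacher-student distillation: KL between softened outputs."""
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)


def similarity_loss(logits_list):
    """Pull student instances toward one another (assumed pairwise KL form)."""
    n = len(logits_list)
    total = logits_list[0].new_zeros(())
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            target = F.softmax(logits_list[j].detach(), dim=1)
            total = total + F.kl_div(
                F.log_softmax(logits_list[i], dim=1), target, reduction="batchmean"
            )
    return total / (n * (n - 1))


def diversity_loss(features_list):
    """Push student features apart (assumed negative pairwise cosine form)."""
    n = len(features_list)
    total = features_list[0].new_zeros(())
    for i in range(n):
        for j in range(i + 1, n):
            total = total - F.cosine_similarity(
                features_list[i].flatten(1), features_list[j].flatten(1)
            ).mean()
    return total / (n * (n - 1) / 2)


def total_loss(labels, teacher_logits, logits_list, features_list,
               alpha=0.5, beta=0.1, gamma=0.1):
    """Sum of per-student losses: cross-entropy + KD + shared similarity and
    diversity terms. In the paper, beta and gamma would be adapted from the
    students' accuracy changes; fixed values are used here for brevity."""
    sim = similarity_loss(logits_list)
    div = diversity_loss(features_list)
    per_student = []
    for logits in logits_list:
        ce = F.cross_entropy(logits, labels)
        per_student.append(ce + alpha * kd_loss(logits, teacher_logits)
                           + beta * sim + gamma * div)
    return torch.stack(per_student).sum()
```

In a full training loop, each student instance would be optimized against its own share of this sum; how the weights beta and gamma should respond to accuracy changes (for example, favoring diversity when the instances' accuracies converge) is specified by the paper itself and is not reproduced here.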