Be your own doctor: Temperature scaling self-knowledge distillation for medical image classification

Neurocomputing · Impact Factor 6.5 · JCR Q1, CAS Tier 2 (Computer Science, Artificial Intelligence) · Published: 2025-07-14 · Epub: 2025-04-05 · DOI: 10.1016/j.neucom.2025.130115
Wenjie Liu, Lei Zhang, Xianliang Zhang, Xinyang Zhou, Xin Wei
Volume 638, Article 130115. Full text: https://www.sciencedirect.com/science/article/pii/S0925231225007878
Citations: 0

Abstract

Self-knowledge distillation (self-KD), in which the student network serves as its own teacher, allows a model to learn from its own predictions. It has been widely studied in medical image tasks as a way to build lightweight models under limited computing resources. However, existing self-KD methods use a single distillation temperature, ignoring the effect of temperature on different classes. In this paper, we investigate how the target-class temperature and the non-target-class temperature each affect self-KD performance. Based on this study, we propose a temperature scaling self-knowledge distillation (TSS-KD) model that better balances target-class and non-target-class knowledge: by scaling the temperatures of the different classes separately, the model distills well-proportioned features and learns better representations. To make the network focus more on local lesions in medical images, we further propose a regional gamma augmentation (RGA) method, which applies stronger perturbations to the same sample to generate more differentiated features; by self-regularizing the consistency of these features, the model learns more local knowledge. To evaluate the proposed method, extensive experiments are conducted on nine medical image classification tasks across eight public datasets. Experimental results show that the method outperforms state-of-the-art self-KD models and generalizes well. The code is available at https://github.com/JeaneyLau/TSS-KD.
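The abstract does not give the exact TSS-KD loss, but the core idea of applying separate temperatures to the target class and the non-target classes before distillation can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name `tss_kd_loss`, the per-class temperature vector, and the plain KL-divergence form are all assumptions.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D logit vector
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def tss_kd_loss(student_logits, teacher_logits, target, tau_target, tau_nontarget):
    """Hypothetical class-wise temperature-scaled self-KD loss.

    The target-class logit is divided by tau_target and all non-target
    logits by tau_nontarget before the softmax, so the two kinds of
    knowledge can be re-balanced; the distributions are then compared
    with a KL divergence (teacher || student).
    """
    taus = np.full(student_logits.shape, tau_nontarget, dtype=float)
    taus[target] = tau_target
    p = softmax(teacher_logits / taus)  # softened "teacher" view
    q = softmax(student_logits / taus)  # softened student view
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

With `tau_target == tau_nontarget` this reduces to ordinary single-temperature distillation; raising `tau_nontarget` relative to `tau_target` flattens the non-target distribution and emphasizes inter-class "dark knowledge".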
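Likewise, the regional gamma augmentation (RGA) described above can be sketched as gamma correction applied to a random rectangular patch of a normalized image, producing a locally perturbed view whose consistency with the original can be self-regularized. The patch-size heuristics and the gamma range here are assumptions, not the paper's settings.

```python
import numpy as np

def regional_gamma_augment(img, gamma_range=(0.5, 2.0), rng=None):
    """Apply gamma correction to one random patch of an image in [0, 1].

    A minimal sketch of a regional gamma augmentation: only the sampled
    rectangle is transformed, so the perturbation is local rather than
    global, encouraging the network to attend to local structure.
    """
    rng = rng or np.random.default_rng()
    h, w = img.shape[:2]
    # sample a patch between a quarter and half of each dimension (assumed)
    ph = rng.integers(h // 4, h // 2 + 1)
    pw = rng.integers(w // 4, w // 2 + 1)
    y = rng.integers(0, h - ph + 1)
    x = rng.integers(0, w - pw + 1)
    gamma = rng.uniform(*gamma_range)
    out = img.copy()
    # gamma < 1 brightens the patch, gamma > 1 darkens it
    out[y:y + ph, x:x + pw] = out[y:y + ph, x:x + pw] ** gamma
    return out
```

Two calls with different random states yield two differently perturbed views of the same sample, which is the kind of pair a consistency term in self-KD would compare.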
Source journal: Neurocomputing (Engineering/Technology – Computer Science: Artificial Intelligence)
CiteScore: 13.10
Self-citation rate: 10.00%
Articles per year: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice, and applications are the essential topics covered.