Ensemble Knowledge Distillation for Federated Semi-Supervised Image Classification

IF 6.6 | CAS Tier 1 (Computer Science) | JCR Q1 (Multidisciplinary) | Tsinghua Science and Technology | Pub Date: 2024-09-11 | DOI: 10.26599/TST.2023.9010156
Ertong Shang;Hui Liu;Jingyang Zhang;Runqi Zhao;Junzhao Du
Citations: 0

Abstract

Federated learning is an emerging privacy-preserving distributed learning paradigm, in which many clients collaboratively train a shared global model under the orchestration of a remote server. Most current works on federated learning have focused on fully supervised learning settings, assuming that all the data are annotated with ground-truth labels. However, this work considers a more realistic and challenging setting, Federated Semi-Supervised Learning (FSSL), where clients have a large amount of unlabeled data and only the server hosts a small number of labeled samples. How to reasonably utilize the server-side labeled data and the client-side unlabeled data is the core challenge in this setting. In this paper, we propose a new FSSL algorithm for image classification based on consistency regularization and ensemble knowledge distillation, called EKDFSSL. Our algorithm uses the global model as the teacher in consistency regularization methods to enhance both the accuracy and stability of client-side unsupervised learning on unlabeled data. Besides, we introduce an additional ensemble knowledge distillation loss to mitigate model overfitting during server-side retraining on labeled data. Extensive experiments on several image classification datasets show that our EKDFSSL outperforms current baseline methods.
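The abstract describes two ingredients: a client-side consistency-regularization loss in which the global model acts as teacher for local unsupervised training, and a server-side ensemble knowledge distillation loss that regularizes retraining on the labeled data. The paper's exact formulation is not reproduced here, so the NumPy sketch below is illustrative only: the function names, the confidence threshold, and the distillation temperature are assumptions, not the authors' definitions.

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / t
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def consistency_loss(teacher_logits, student_logits, threshold=0.95):
    """Client-side unsupervised loss: the global (teacher) model's confident
    predictions on unlabeled data serve as pseudo-labels for the local
    (student) model. Returns the mean cross-entropy over samples whose
    teacher confidence exceeds `threshold`, or 0.0 if none qualify."""
    p_teacher = softmax(teacher_logits)
    confidence = p_teacher.max(axis=-1)
    pseudo_labels = p_teacher.argmax(axis=-1)
    mask = confidence >= threshold
    if not mask.any():
        return 0.0
    p_student = softmax(student_logits)
    ce = -np.log(p_student[np.arange(len(pseudo_labels)), pseudo_labels] + 1e-12)
    return float(ce[mask].mean())

def ensemble_kd_loss(server_logits, client_logits_list, t=2.0):
    """Server-side regularizer: KL divergence from the averaged (ensemble)
    client predictions to the server model's softened predictions,
    discouraging overfitting to the small labeled set."""
    p_ensemble = np.mean([softmax(l, t) for l in client_logits_list], axis=0)
    p_server = softmax(server_logits, t)
    kl = (p_ensemble * (np.log(p_ensemble + 1e-12)
                        - np.log(p_server + 1e-12))).sum(axis=-1)
    return float(kl.mean())
```

In use, a client would add `consistency_loss` to its objective on unlabeled batches, while the server would combine the ordinary cross-entropy on labeled data with `ensemble_kd_loss` against the uploaded client models; the confidence mask keeps low-quality pseudo-labels from destabilizing local training.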
Source journal
Tsinghua Science and Technology
Categories: COMPUTER SCIENCE, INFORMATION SYSTEMS; COMPUTER SCIENCE, SOFTWARE ENGINEERING
CiteScore: 10.20
Self-citation rate: 10.60%
Articles published: 2340
Journal introduction: Tsinghua Science and Technology (Tsinghua Sci Technol) started publication in 1996. It is an international academic journal sponsored by Tsinghua University and published bimonthly. The journal presents up-to-date scientific achievements in computer science, electronic engineering, and other IT fields. Contributions from all over the world are welcome.
Latest articles in this journal:
Contents
Front Cover
LP-Rounding Based Algorithm for Capacitated Uniform Facility Location Problem with Soft Penalties
A P4-Based Approach to Traffic Isolation and Bandwidth Management for 5G Network Slicing
Quantum-Inspired Sensitive Data Measurement and Secure Transmission in 5G-Enabled Healthcare Systems