{"title":"One Teacher is Enough: A Server-Clueless Federated Learning With Knowledge Distillation","authors":"Wanyi Ning;Qi Qi;Jingyu Wang;Mengde Zhu;Shaolong Li;Guang Yang;Jianxin Liao","doi":"10.1109/TSC.2024.3414372","DOIUrl":null,"url":null,"abstract":"Machine learning-based services offer intelligent solutions with powerful models. To enhance model robustness, Federated Learning (FL) emerges as a promising collaborative learning paradigm, which iteratively trains a global model through parameter exchange among multiple clients based on their local data. Generally, the local data are heterogeneous, which slows down convergence. Knowledge distillation is an effective technique against data heterogeneity while existing works distill the ensemble knowledge from local models, ignoring the natural global knowledge from the aggregated model. This places limitations on their algorithms, such as the need for proxy data or the necessary exposure of local models to the server, which is prohibited in most privacy-preserving FL with a clueless server. In this work, we propose FedDGT, a novel knowledge distillation method for industrial server-clueless FL. FedDGT regards the aggregated model as the only one teacher to impart its global knowledge into a generator and then regularizes the drifted local models through the generator, overcoming previous limitations and providing better privacy and scalability support. 
Extensive experiments demonstrate that FedDGT can achieve highly-competitive model performance while greatly reducing the communication rounds in a server-clueless scenario.","PeriodicalId":13255,"journal":{"name":"IEEE Transactions on Services Computing","volume":"17 5","pages":"2704-2718"},"PeriodicalIF":5.8000,"publicationDate":"2024-06-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Services Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10556806/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
Machine learning-based services offer intelligent solutions built on powerful models. To enhance model robustness, Federated Learning (FL) has emerged as a promising collaborative learning paradigm that iteratively trains a global model through parameter exchange among multiple clients based on their local data. Local data are generally heterogeneous, which slows down convergence. Knowledge distillation is an effective technique against data heterogeneity, but existing works distill the ensemble knowledge from local models, ignoring the natural global knowledge in the aggregated model. This places limitations on their algorithms, such as the need for proxy data or the exposure of local models to the server, which is prohibited in most privacy-preserving FL settings with a clueless server. In this work, we propose FedDGT, a novel knowledge distillation method for industrial server-clueless FL. FedDGT regards the aggregated model as the single teacher, imparts its global knowledge into a generator, and then regularizes the drifted local models through the generator, overcoming the previous limitations and providing better privacy and scalability support. Extensive experiments demonstrate that FedDGT achieves highly competitive model performance while greatly reducing communication rounds in a server-clueless scenario.
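The mechanism the abstract describes, treating the aggregated model as the single teacher and using synthetic inputs to regularize drifted local models, can be sketched in a toy form. This is an illustrative sketch only, not the authors' implementation: the models are plain linear softmax classifiers, the generator is replaced by random synthetic inputs, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable row-wise softmax.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mean_kl(p, q):
    # Mean KL divergence between row-wise probability distributions.
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=1)))

# Hypothetical setup: 3 clients, each holding a drifted linear softmax model W.
d, k, n_clients = 5, 3, 3
clients = [rng.normal(size=(d, k)) for _ in range(n_clients)]

# Server-side FedAvg-style aggregation: the aggregated model is the one teacher.
teacher = sum(clients) / n_clients

# Stand-in for the generator: synthetic inputs meant to carry the teacher's
# global knowledge (FedDGT trains a generator network for this; here we just
# draw random inputs for illustration).
x_syn = rng.normal(size=(64, d))
p_teacher = softmax(x_syn @ teacher)  # teacher's soft labels

# Client-side regularization: gradient steps on the distillation loss
# (cross-entropy to the teacher's soft labels), pulling each drifted local
# model back toward the global knowledge.
lr, steps = 0.2, 10
kl_before = [mean_kl(p_teacher, softmax(x_syn @ W)) for W in clients]
for i, W in enumerate(clients):
    for _ in range(steps):
        p_local = softmax(x_syn @ W)
        grad = x_syn.T @ (p_local - p_teacher) / len(x_syn)
        W = W - lr * grad
    clients[i] = W
kl_after = [mean_kl(p_teacher, softmax(x_syn @ W)) for W in clients]
```

After the regularization steps, each local model's predictions on the synthetic inputs move measurably closer to the teacher's (the per-client KL divergence shrinks), which is the effect the generator-based regularization is meant to achieve without any proxy data or exposure of local models to the server.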
Journal Introduction:
IEEE Transactions on Services Computing encompasses the computing and software aspects of the science and technology of services innovation research and development. It emphasizes the algorithmic, mathematical, statistical, and computational methods central to services computing. Topics covered include Service Oriented Architecture, Web Services, Business Process Integration, Solution Performance Management, and Services Operations and Management. The transactions address mathematical foundations, security, privacy, agreement, contract, discovery, negotiation, collaboration, and quality of service for web services, as well as composite web service creation, business and scientific applications, standards, utility models, business process modeling, integration, and collaboration in the realm of Services Computing.