{"title":"PFL-DKD: Modeling decoupled knowledge fusion with distillation for improving personalized federated learning","authors":"","doi":"10.1016/j.comnet.2024.110758","DOIUrl":null,"url":null,"abstract":"<div><p>We develop a novel framework for personalized federated learning (PFL) utilizing a decoupled version of knowledge distillation (DKD). Unlike traditional PFL methods, the proposed PFL-DKD creates a dynamically connected network among local clients and categorizes them according to their knowledge, storage, and computational capabilities. The developed decoupling of knowledge distillation into target class (TC) and latent class (LC) enables knowledge-rich clients to efficiently transfer their expertise to knowledge-poor clients. To further enhance our innovative PFL-DKD approach, we extend it to PFL-FDKD by introducing a ”logit fusion” that seamlessly aggregates knowledge and experiences from neighboring clients. Both our theoretical analyses and extensive experiments reveal that PFL-DKD outperforms existing centralized and decentralized PFL approaches, making significant strides in mitigating the challenges associated with heterogeneous data and system configurations. The details of our implementation with the codebase are in <span><span>PFL-DKD</span><svg><path></path></svg></span>.</p></div>","PeriodicalId":50637,"journal":{"name":"Computer Networks","volume":null,"pages":null},"PeriodicalIF":4.4000,"publicationDate":"2024-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1389128624005905","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
Abstract
We develop a novel framework for personalized federated learning (PFL) that utilizes a decoupled version of knowledge distillation (DKD). Unlike traditional PFL methods, the proposed PFL-DKD creates a dynamically connected network among local clients and categorizes them according to their knowledge, storage, and computational capabilities. Decoupling knowledge distillation into a target-class (TC) term and a latent-class (LC) term enables knowledge-rich clients to efficiently transfer their expertise to knowledge-poor clients. To further enhance PFL-DKD, we extend it to PFL-FDKD by introducing a "logit fusion" mechanism that seamlessly aggregates knowledge and experience from neighboring clients. Both our theoretical analysis and extensive experiments show that PFL-DKD outperforms existing centralized and decentralized PFL approaches, making significant strides in mitigating the challenges of heterogeneous data and system configurations. Implementation details and the codebase are available in the PFL-DKD repository.
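The abstract does not spell out the loss, but decoupled knowledge distillation conventionally splits the KL divergence between teacher and student logits into a binary target-vs-rest term and a term over the remaining classes. The sketch below follows that standard formulation, renaming the non-target term "latent class" (LC) to match the paper's terminology; the function name, the weights `alpha`/`beta`, and the temperature `T` are illustrative assumptions, not values taken from the paper.

```python
import torch
import torch.nn.functional as F

def dkd_loss(student_logits, teacher_logits, target, alpha=1.0, beta=2.0, T=4.0):
    """Hypothetical decoupled-KD loss: alpha * TC term + beta * LC term."""
    # gt is 1 at the ground-truth class, 0 elsewhere.
    gt = F.one_hot(target, num_classes=student_logits.size(1)).float()

    p_s = F.softmax(student_logits / T, dim=1)
    p_t = F.softmax(teacher_logits / T, dim=1)

    # Target-class (TC) term: KL between the binary distributions
    # (target class vs. all other classes) of teacher and student.
    p_s_bin = torch.stack([(p_s * gt).sum(1), (p_s * (1 - gt)).sum(1)], dim=1)
    p_t_bin = torch.stack([(p_t * gt).sum(1), (p_t * (1 - gt)).sum(1)], dim=1)
    tc = F.kl_div(torch.log(p_s_bin + 1e-8), p_t_bin, reduction="batchmean") * T**2

    # Latent-class (LC) term: KL over the non-target classes only,
    # obtained by masking out the target logit before the softmax.
    log_p_s_lc = F.log_softmax(student_logits / T - 1000.0 * gt, dim=1)
    p_t_lc = F.softmax(teacher_logits / T - 1000.0 * gt, dim=1)
    lc = F.kl_div(log_p_s_lc, p_t_lc, reduction="batchmean") * T**2

    return alpha * tc + beta * lc
```

The abstract likewise does not specify the "logit fusion" rule used in PFL-FDKD. A minimal sketch, assuming fusion is a weighted average of per-sample logits received from neighboring clients (the function name and the uniform-weight default are assumptions):

```python
import torch

def fuse_neighbor_logits(neighbor_logits, weights=None):
    """Fuse logits from K neighbors into one teacher signal.

    neighbor_logits: list of K tensors, each [batch, num_classes].
    weights: optional per-neighbor importance; uniform if None.
    """
    stacked = torch.stack(neighbor_logits)            # [K, batch, classes]
    if weights is None:
        weights = torch.ones(len(neighbor_logits))
    w = weights / weights.sum()                       # normalize to a convex combination
    return (w.view(-1, 1, 1) * stacked).sum(dim=0)    # [batch, classes]
```

Under these assumptions, the fused output would serve as `teacher_logits` in `dkd_loss`, so a knowledge-poor client distills from the aggregated signal of its knowledge-rich neighbors rather than from any single teacher.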
Journal description:
Computer Networks is an international, archival journal providing complete coverage of all topics of interest to those working in computer communication networks. The audience includes researchers, managers, and operators of networks, as well as designers and implementers. The Editorial Board will consider for publication any material of interest to these groups.