{"title":"A Novel Hierarchically Decentralized Federated Learning Framework in 6G Wireless Networks","authors":"J. Zhang, Li Chen, Xiaohui Chen, Guo Wei","doi":"10.1109/INFOCOMWKSHPS57453.2023.10226164","DOIUrl":null,"url":null,"abstract":"Decentralized federated learning (DFL) architecture enables clients to collaboratively train a shared machine learning model without a central parameter server. However, it is difficult to apply in multicell scenarios. In this paper, we propose an integrated hierarchically decentralized federated learning (HDFL) framework, where devices from different cells collaboratively train a global model under periodically intra-cell D2D consensus and inter-cell aggregation. We establish strong convergence guarantees for the proposed HDFL algorithm without assuming convex objectives. The convergence rate of HDFL can be optimized to achieve the balance of model accuracy and communication overhead. To improve the wireless performance of HDFL, we formulate an optimization problem to minimize the training latency and energy overhead. Numerical results based on the CIFAR-10 dataset validate the superiority of HDFL over traditional DFL methods in the multicell scenario.","PeriodicalId":354290,"journal":{"name":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)","volume":"26 13","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE INFOCOM 2023 - IEEE Conference on Computer Communications Workshops (INFOCOM WKSHPS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INFOCOMWKSHPS57453.2023.10226164","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Decentralized federated learning (DFL) architectures enable clients to collaboratively train a shared machine learning model without a central parameter server. However, DFL is difficult to apply in multicell scenarios. In this paper, we propose an integrated hierarchically decentralized federated learning (HDFL) framework, in which devices from different cells collaboratively train a global model through periodic intra-cell D2D consensus and inter-cell aggregation. We establish strong convergence guarantees for the proposed HDFL algorithm without assuming convex objectives. The convergence rate of HDFL can be optimized to balance model accuracy against communication overhead. To improve the wireless performance of HDFL, we formulate an optimization problem that minimizes the training latency and energy overhead. Numerical results on the CIFAR-10 dataset validate the superiority of HDFL over traditional DFL methods in the multicell scenario.
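To make the two-timescale structure described above concrete, the following is a minimal, hypothetical sketch of an HDFL-style training loop: each device runs local SGD, models are averaged within a cell every few steps (standing in for D2D consensus with uniform mixing weights), and all cells are aggregated less frequently. It uses a toy linear-regression problem on synthetic data rather than CIFAR-10, and the period lengths `tau1`, `tau2` and uniform averaging are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_cells, devices_per_cell, dim = 3, 4, 10
true_w = rng.normal(size=dim)

# Synthetic per-device data shards (a stand-in for the paper's CIFAR-10 partitions).
def make_shard():
    X = rng.normal(size=(64, dim))
    y = X @ true_w + 0.1 * rng.normal(size=64)
    return X, y

shards = [[make_shard() for _ in range(devices_per_cell)] for _ in range(n_cells)]

# One model copy per device, indexed as w[cell][device].
w = [[np.zeros(dim) for _ in range(devices_per_cell)] for _ in range(n_cells)]

lr = 0.01
T = 300
tau1, tau2 = 5, 25  # assumed periods: intra-cell consensus every tau1 steps,
                    # inter-cell aggregation every tau2 steps

for t in range(1, T + 1):
    # Local SGD step on every device.
    for c in range(n_cells):
        for d in range(devices_per_cell):
            X, y = shards[c][d]
            grad = X.T @ (X @ w[c][d] - y) / len(y)
            w[c][d] = w[c][d] - lr * grad

    # Periodic intra-cell D2D consensus, modeled here as uniform averaging
    # over the devices of each cell (an assumed doubly stochastic mixing).
    if t % tau1 == 0:
        for c in range(n_cells):
            cell_avg = np.mean(w[c], axis=0)
            for d in range(devices_per_cell):
                w[c][d] = cell_avg.copy()

    # Less frequent inter-cell aggregation across all cells.
    if t % tau2 == 0:
        global_avg = np.mean([m for cell in w for m in cell], axis=0)
        for c in range(n_cells):
            for d in range(devices_per_cell):
                w[c][d] = global_avg.copy()

final_model = np.mean([m for cell in w for m in cell], axis=0)
print("distance to true weights:", np.linalg.norm(final_model - true_w))
```

In this sketch, increasing `tau1` and `tau2` reduces communication rounds at the cost of larger disagreement between device models, which mirrors the accuracy/communication trade-off the convergence analysis is said to optimize.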