Heterogeneity-Aware Cooperative Federated Edge Learning with Adaptive Computation and Communication Compression

Zhenxiao Zhang, Zhidong Gao, Yuanxiong Guo, Yanmin Gong

arXiv - CS - Distributed, Parallel, and Cluster Computing
DOI: arxiv-2409.04022 (https://doi.org/arxiv-2409.04022)
Published: 2024-09-06
Citations: 0
Abstract
Motivated by the drawbacks of cloud-based federated learning (FL),
cooperative federated edge learning (CFEL) has been proposed to improve
efficiency for FL over mobile edge networks, where multiple edge servers
collaboratively coordinate the distributed model training across a large number
of edge devices. However, CFEL faces critical challenges arising from dynamic
and heterogeneous device properties, which slow down the convergence and
increase resource consumption. This paper proposes a heterogeneity-aware CFEL
scheme called \textit{Heterogeneity-Aware Cooperative Edge-based Federated
Averaging} (HCEF) that aims to maximize the model accuracy while minimizing the
training time and energy consumption via adaptive computation and communication
compression in CFEL. By theoretically analyzing how local update frequency and
gradient compression affect the convergence error bound in CFEL, we develop an
efficient online control algorithm for HCEF to dynamically determine local
update frequencies and compression ratios for heterogeneous devices.
Experimental results show that compared with prior schemes, the proposed HCEF
scheme can maintain higher model accuracy while reducing training latency and
improving energy efficiency simultaneously.
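The abstract describes assigning per-device compression ratios to heterogeneous devices so that bandwidth-limited devices send sparser updates. As a minimal illustration only (the paper does not specify its compression operator here; top-k sparsification is a common choice assumed for this sketch, and the devices and ratios below are hypothetical):

```python
import numpy as np

def top_k_compress(grad, ratio):
    """Keep the largest-magnitude `ratio` fraction of gradient entries, zeroing the rest."""
    k = max(1, int(ratio * grad.size))
    flat = grad.ravel()
    # argpartition finds the indices of the k largest |values| in O(n)
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(grad.shape)

# Hypothetical heterogeneous devices: a constrained device gets a
# smaller ratio (sparser update), a capable one a larger ratio.
rng = np.random.default_rng(0)
grad = rng.standard_normal((4, 5))
for device, ratio in [("slow_device", 0.1), ("fast_device", 0.5)]:
    compressed = top_k_compress(grad, ratio)
    print(device, "kept", np.count_nonzero(compressed), "of", grad.size, "entries")
```

In HCEF these ratios (together with local update frequencies) are not fixed but chosen dynamically by the online control algorithm, guided by the convergence error bound.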