{"title":"PBFL: Communication-Efficient Federated Learning via Parameter Predicting","authors":"Kaiju Li;Chunhua Xiao","doi":"10.1093/comjnl/bxab184","DOIUrl":null,"url":null,"abstract":"Federated learning (FL) is an emerging privacy-preserving technology for machine learning, which enables end devices to cooperatively train a global model without uploading their local sensitive data. Because of limited network bandwidth and considerable communication overhead, communication efficiency has become an essential bottleneck for FL. Existing solutions attempt to improve this situation by reducing communication rounds while usually come with more computation resource consumption or model accuracy deterioration. In this paper, we propose a parameter Prediction-Based DL (PBFL). In which an extended Kalman filter-based prediction algorithm, a practical prediction error threshold setting mechanism and an effective global model updating strategy are included. Instead of collecting all updates from participants, PBFL takes advantage of predicting values to aggregate the model, which substantially reduces required communication rounds while guaranteeing model accuracy. Inspired by the idea of prediction, each participant checks whether its prediction value is out of the tolerance threshold limits and only uploads local updates that have an inaccurate prediction value. In this way, no additional local computational resources are required. Experimental results on both multilayer perceptrons and convolutional neural networks show that PBFL outperforms the state-of-the-art methods and improves the communication efficiency by >66% with 1% higher model accuracy.","PeriodicalId":50641,"journal":{"name":"Computer Journal","volume":"66 3","pages":"626-642"},"PeriodicalIF":1.5000,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Journal","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10084364/","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
Citations: 1
Abstract
Federated learning (FL) is an emerging privacy-preserving technology for machine learning that enables end devices to cooperatively train a global model without uploading their local sensitive data. Because of limited network bandwidth and considerable communication overhead, communication efficiency has become an essential bottleneck for FL. Existing solutions attempt to improve this situation by reducing communication rounds, but usually at the cost of additional computation resource consumption or degraded model accuracy. In this paper, we propose a parameter Prediction-Based Federated Learning framework (PBFL), which comprises an extended Kalman filter-based prediction algorithm, a practical prediction-error threshold setting mechanism and an effective global model updating strategy. Instead of collecting all updates from participants, PBFL aggregates the model using predicted values, which substantially reduces the required communication rounds while guaranteeing model accuracy. Following this prediction idea, each participant checks whether its prediction error exceeds the tolerance threshold and uploads its local update only when the prediction is inaccurate. In this way, no additional local computational resources are required. Experimental results on both multilayer perceptrons and convolutional neural networks show that PBFL outperforms state-of-the-art methods, improving communication efficiency by more than 66% with 1% higher model accuracy.
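To make the mechanism concrete, the sketch below illustrates the client-side upload decision the abstract describes: each participant predicts its next parameter update, measures the prediction error against the actual local update, and transmits only when the error exceeds a tolerance threshold. This is a minimal illustration, not the paper's implementation: PBFL uses an extended Kalman filter and a dedicated threshold-setting mechanism whose details are not given in this abstract, so the simple per-parameter scalar filter, the fixed relative-error threshold and all names below are assumptions.

```python
# Hypothetical sketch of a PBFL-style client-side upload decision.
# A constant-state scalar Kalman filter per parameter stands in for the
# paper's extended Kalman filter; names and constants are illustrative.
import numpy as np

class ScalarKalmanPredictor:
    """Tracks a flattened parameter vector with a constant-state Kalman filter."""
    def __init__(self, dim, process_var=1e-4, measure_var=1e-2):
        self.x = np.zeros(dim)   # state estimate (predicted parameters)
        self.p = np.ones(dim)    # per-parameter estimate variance
        self.q = process_var     # process noise variance
        self.r = measure_var     # measurement noise variance

    def predict(self):
        self.p = self.p + self.q   # time update: uncertainty grows each round
        return self.x              # constant-state model: prediction = last estimate

    def update(self, measured):
        k = self.p / (self.p + self.r)            # Kalman gain
        self.x = self.x + k * (measured - self.x) # measurement update
        self.p = (1.0 - k) * self.p
        return self.x

def should_upload(predicted, actual, threshold=0.05):
    """Upload only when the relative prediction error exceeds the threshold."""
    err = np.linalg.norm(actual - predicted) / (np.linalg.norm(actual) + 1e-12)
    return err > threshold

# Usage: simulate a few rounds of slowly drifting local parameters.
rng = np.random.default_rng(0)
params = rng.normal(size=100)
predictor = ScalarKalmanPredictor(dim=100)
for rnd in range(5):
    params = params + rng.normal(scale=0.01, size=100)  # stand-in for local training
    guess = predictor.predict()
    upload = should_upload(guess, params)
    predictor.update(params)  # the client always tracks its own true parameters
    print(f"round {rnd}: upload={upload}")
```

Because the server runs the same predictor, it can substitute its own prediction for any update a client withholds, so every skipped upload translates directly into saved communication.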
Journal overview:
The Computer Journal is one of the longest-established journals serving all branches of the academic computer science community. It is currently published in four sections.