FedACQ: adaptive clustering quantization of model parameters in federated learning

International Journal of Web Information Systems · IF 2.5 · Q2 (COMPUTER SCIENCE, INFORMATION SYSTEMS) · Pub Date: 2023-11-28 · DOI: 10.1108/ijwis-08-2023-0128
Tingting Tian, Hongjian Shi, Ruhui Ma, Yuan Liu
{"title":"FedACQ:联合学习中模型参数的自适应聚类量化","authors":"Tingting Tian, Hongjian Shi, Ruhui Ma, Yuan Liu","doi":"10.1108/ijwis-08-2023-0128","DOIUrl":null,"url":null,"abstract":"Purpose For privacy protection, federated learning based on data separation allows machine learning models to be trained on remote devices or in isolated data devices. However, due to the limited resources such as bandwidth and power of local devices, communication in federated learning can be much slower than in local computing. This study aims to improve communication efficiency by reducing the number of communication rounds and the size of information transmitted in each round. Design/methodology/approach This paper allows each user node to perform multiple local trainings, then upload the local model parameters to a central server. The central server updates the global model parameters by weighted averaging the parameter information. Based on this aggregation, user nodes first cluster the parameter information to be uploaded and then replace each value with the mean value of its cluster. Considering the asymmetry of the federated learning framework, adaptively select the optimal number of clusters required to compress the model information. Findings While maintaining the loss convergence rate similar to that of federated averaging, the test accuracy did not decrease significantly. Originality/value By compressing uplink traffic, the work can improve communication efficiency on dynamic networks with limited resources.","PeriodicalId":44153,"journal":{"name":"International Journal of Web Information Systems","volume":"8 1","pages":""},"PeriodicalIF":2.5000,"publicationDate":"2023-11-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"FedACQ: adaptive clustering quantization of model parameters in federated learning\",\"authors\":\"Tingting Tian, Hongjian Shi, Ruhui Ma, Yuan Liu\",\"doi\":\"10.1108/ijwis-08-2023-0128\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Purpose For privacy protection, federated learning based on data separation allows machine learning models to be trained on remote devices or in isolated data devices. However, due to the limited resources such as bandwidth and power of local devices, communication in federated learning can be much slower than in local computing. This study aims to improve communication efficiency by reducing the number of communication rounds and the size of information transmitted in each round. Design/methodology/approach This paper allows each user node to perform multiple local trainings, then upload the local model parameters to a central server. The central server updates the global model parameters by weighted averaging the parameter information. Based on this aggregation, user nodes first cluster the parameter information to be uploaded and then replace each value with the mean value of its cluster. Considering the asymmetry of the federated learning framework, adaptively select the optimal number of clusters required to compress the model information. Findings While maintaining the loss convergence rate similar to that of federated averaging, the test accuracy did not decrease significantly. 
Originality/value By compressing uplink traffic, the work can improve communication efficiency on dynamic networks with limited resources.\",\"PeriodicalId\":44153,\"journal\":{\"name\":\"International Journal of Web Information Systems\",\"volume\":\"8 1\",\"pages\":\"\"},\"PeriodicalIF\":2.5000,\"publicationDate\":\"2023-11-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Web Information Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1108/ijwis-08-2023-0128\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Web Information Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1108/ijwis-08-2023-0128","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Purpose
For privacy protection, federated learning based on data separation allows machine learning models to be trained on remote devices or on isolated data devices. However, because local devices have limited resources such as bandwidth and power, communication in federated learning can be much slower than local computation. This study aims to improve communication efficiency by reducing both the number of communication rounds and the amount of information transmitted in each round.

Design/methodology/approach
Each user node performs multiple rounds of local training and then uploads its local model parameters to a central server. The central server updates the global model parameters by taking a weighted average of the uploaded parameters. On top of this aggregation scheme, each user node first clusters the parameter values to be uploaded and then replaces each value with the mean of its cluster. To account for the asymmetry of the federated learning framework, the number of clusters used to compress the model information is selected adaptively.

Findings
While maintaining a loss convergence rate similar to that of federated averaging, the method does not significantly reduce test accuracy.

Originality/value
By compressing uplink traffic, this work improves communication efficiency on dynamic networks with limited resources.
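To make the scheme concrete, below is a minimal Python sketch of the client-side clustering quantization and the server-side weighted averaging described above. It is not the authors' implementation: the abstract does not name the clustering algorithm or the adaptive criterion, so this sketch assumes k-means for the clustering step and uses a silhouette score as a stand-in for the adaptive choice of the number of clusters; the function names (cluster_quantize, adaptive_quantize, weighted_average) are illustrative.

```python
# Minimal sketch of the pipeline described in the abstract. Both the
# k-means clustering and the silhouette-score criterion are assumptions;
# the abstract does not specify either.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score


def cluster_quantize(params: np.ndarray, k: int) -> np.ndarray:
    """Cluster the parameter values and replace each value with the
    mean of its cluster, as the abstract describes."""
    values = params.reshape(-1, 1)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(values)
    centroids = km.cluster_centers_.ravel()  # per-cluster means
    return centroids[km.labels_].reshape(params.shape)


def adaptive_quantize(params: np.ndarray, k_candidates=(2, 4, 8, 16)) -> np.ndarray:
    """Pick the number of clusters adaptively. The silhouette score is
    a stand-in for the paper's (unspecified) selection criterion."""
    values = params.reshape(-1, 1)
    best_k, best_score = k_candidates[0], -1.0
    for k in k_candidates:
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(values)
        score = silhouette_score(values, labels,
                                 sample_size=min(1000, len(values)),
                                 random_state=0)
        if score > best_score:
            best_k, best_score = k, score
    return cluster_quantize(params, best_k)


def weighted_average(client_params, weights):
    """Server-side aggregation: weighted average of the (quantized)
    parameters uploaded by the clients."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return sum(wi * p for wi, p in zip(w, client_params))


# Usage: each client quantizes its locally trained parameters before
# upload; the server aggregates them, weighted e.g. by local dataset size.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    uploads = [adaptive_quantize(rng.normal(size=(100,))) for _ in range(3)]
    global_params = weighted_average(uploads, weights=[50, 30, 20])
    print(global_params.shape)  # (100,)
```

The uplink saving comes from what needs to be transmitted after quantization: the k cluster means plus a low-bit cluster index per parameter, rather than a full-precision value per parameter.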
Source journal
International Journal of Web Information Systems (COMPUTER SCIENCE, INFORMATION SYSTEMS)
CiteScore: 4.60 · Self-citation rate: 0.00% · Articles published per year: 19
About the journal
The Global Information Infrastructure is a daily reality. In spite of the many applications in all domains of our societies (e-business, e-commerce, e-learning, e-science and e-government, for instance), and in spite of the tremendous advances by engineers and scientists, the seamless development of Web information systems and services remains a major challenge. The journal examines how the current shared vision for the future is one of semantically rich information and service-oriented architectures for global information systems. This vision lies at the convergence of progress in technologies such as XML, Web services, RDF and OWL; of multimedia, multimodal and multilingual information retrieval; and of distributed, mobile and ubiquitous computing.

Topicality
While the International Journal of Web Information Systems covers a broad range of topics, the journal welcomes papers that provide a perspective on all aspects of Web information systems: Web semantics and Web dynamics, Web mining and searching, Web databases and Web data integration, Web-based commerce and e-business, Web collaboration and distributed computing, Internet computing and networks, performance of Web applications, and Web multimedia services and Web-based education.
Latest articles in this journal
- ImageNet classification with Raspberry Pis: federated learning algorithms of local classifiers
- A review of in-memory computing for machine learning: architectures, options
- Efficient knowledge distillation for remote sensing image classification: a CNN-based approach
- FedACQ: adaptive clustering quantization of model parameters in federated learning
- A systematic literature review of authorization and access control requirements and current state of the art for different database models