Personalized Federated Learning with Gradient Similarity

Jing Xie, Xiang Yin, Xiyi Zhang, Juan Chen, Q. Wen
Published in: 2021 18th International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP)
DOI: 10.1109/ICCWAMTIP53232.2021.9674055
Publication date: 2021-12-17

Abstract

In conventional federated learning, the local models of multiple clients are trained independently on their private data, and the central server generates a shared global model by aggregating the local models. However, the global model often fails to adapt to each client due to statistical heterogeneity, such as non-IID data. To address this problem, we propose the Subclass Personalized Federated Learning (SPFL) algorithm for non-IID data. In SPFL, the server uses Softmax Normalized Gradient Similarity (SNGS) to weight the relationships between clients, and sends a personalized global model to each client. The stage strategy of ResNet is also applied to improve the performance of our algorithm. The experimental results show that on non-IID data the SPFL algorithm outperforms the vanilla FedAvg, Per-FedAvg, FedUpdate, and pFedMe algorithms, improving accuracy by 1.81–18.46% on four datasets (CIFAR10, CIFAR100, MNIST, EMNIST), while still maintaining state-of-the-art performance on IID data.
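The abstract describes the server weighting client relationships via softmax-normalized gradient similarity and then sending each client a personalized aggregate. The paper does not give the exact formulation here, but the idea can be sketched as follows: compute pairwise cosine similarities between client gradients, softmax-normalize each client's similarity row into aggregation weights, and form each client's personalized model as a weighted average of all local models. Function names, the temperature parameter, and the cosine-similarity choice are assumptions for illustration, not the authors' code.

```python
import numpy as np

def sngs_weights(client_grads, temperature=1.0):
    """Softmax Normalized Gradient Similarity (sketch): for each client i,
    compute the cosine similarity of its gradient to every client's gradient,
    then softmax-normalize row i into aggregation weights for client i."""
    # L2-normalize each flattened gradient so the dot product is cosine similarity
    G = np.stack([g / (np.linalg.norm(g) + 1e-12) for g in client_grads])
    logits = (G @ G.T) / temperature          # pairwise cosine similarities
    # numerically stable row-wise softmax
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def personalized_models(client_models, weights):
    """Each client's personalized global model is the similarity-weighted
    average of all clients' local model parameters."""
    M = np.stack(client_models)               # (num_clients, num_params)
    return weights @ M                        # row i = client i's model
```

Under this sketch, a client whose gradient points in a similar direction to another's contributes more to that client's personalized model, while clients with dissimilar (e.g. opposing) gradients are down-weighted rather than averaged in uniformly as in FedAvg.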