{"title":"基于Bregman优化的联邦学习隐私保护方法","authors":"Gengming Zhu, Jiyong Zhang, Shaobo Zhang, Yijie Yin","doi":"10.1109/CSCloud-EdgeCom58631.2023.00023","DOIUrl":null,"url":null,"abstract":"Federated learning has received a lot of attention for its ability to solve the data silo problem, but it is also limited by the problem of data heterogeneity and privacy. Non-Independent Identical Distribution (Non-I.I.D) data leads to performance degradation of federation models, and privacy problem have been studied as a hot topic in the field of federated learning. However, current research rarely considers non-I.I.D data and privacy simultaneously. In this paper, we propose a federated learning scheme based on Bregman and differential privacy (FLBDP). Our approach adopts Bregman distance for personalized model training, which aims to control the difference between local model and global model in a limited range, the FLBDP can reduce the model difference to improve the model performance by Bregman optimization. In addition, we use a Gaussian mechanism to perturb the personalized model and update the local model by the perturbed personalized model, which enables the model parameters to satisfy differential privacy in the uplink channel to enhance user privacy protection.","PeriodicalId":56007,"journal":{"name":"Journal of Cloud Computing-Advances Systems and Applications","volume":"43 1","pages":"85-90"},"PeriodicalIF":3.7000,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Federated Learning Privacy-preserving Method Based on Bregman Optimization\",\"authors\":\"Gengming Zhu, Jiyong Zhang, Shaobo Zhang, Yijie Yin\",\"doi\":\"10.1109/CSCloud-EdgeCom58631.2023.00023\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning has received a lot of attention for its ability to solve the data silo problem, but it is also limited by the problem of data heterogeneity and privacy. Non-Independent Identical Distribution (Non-I.I.D) data leads to performance degradation of federation models, and privacy problem have been studied as a hot topic in the field of federated learning. However, current research rarely considers non-I.I.D data and privacy simultaneously. In this paper, we propose a federated learning scheme based on Bregman and differential privacy (FLBDP). Our approach adopts Bregman distance for personalized model training, which aims to control the difference between local model and global model in a limited range, the FLBDP can reduce the model difference to improve the model performance by Bregman optimization. 
In addition, we use a Gaussian mechanism to perturb the personalized model and update the local model by the perturbed personalized model, which enables the model parameters to satisfy differential privacy in the uplink channel to enhance user privacy protection.\",\"PeriodicalId\":56007,\"journal\":{\"name\":\"Journal of Cloud Computing-Advances Systems and Applications\",\"volume\":\"43 1\",\"pages\":\"85-90\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2023-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Cloud Computing-Advances Systems and Applications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1109/CSCloud-EdgeCom58631.2023.00023\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Cloud Computing-Advances Systems and Applications","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/CSCloud-EdgeCom58631.2023.00023","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Federated Learning Privacy-preserving Method Based on Bregman Optimization
Federated learning has received considerable attention for its ability to address the data-silo problem, but it is limited by data heterogeneity and privacy concerns. Non-independent and identically distributed (non-IID) data degrades the performance of federated models, and privacy has become a hot research topic in the field of federated learning. However, current research rarely considers non-IID data and privacy simultaneously. In this paper, we propose a federated learning scheme based on Bregman optimization and differential privacy (FLBDP). Our approach uses the Bregman distance for personalized model training, which keeps the difference between the local model and the global model within a limited range; through Bregman optimization, FLBDP reduces this model difference and thereby improves model performance. In addition, we use a Gaussian mechanism to perturb the personalized model and update the local model with the perturbed personalized model, so that the model parameters satisfy differential privacy on the uplink channel and user privacy protection is strengthened.
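To make the two components of the abstract concrete, the following is a minimal Python sketch of one FLBDP-style client round. It assumes the Bregman distance is instantiated as a quadratic proximal term ||w - w_global||^2 and that the Gaussian mechanism follows the standard clip-and-noise recipe; the function and parameter names (grad_fn, mu, clip_norm, noise_multiplier) are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (not the authors' code) of a Bregman-regularized local update
# followed by Gaussian-mechanism perturbation of the personalized model before upload.
import numpy as np


def local_update(w_global, grad_fn, mu=0.1, lr=0.01, steps=10):
    """Train a personalized model while a quadratic Bregman/proximal penalty
    keeps it within a limited range of the global model."""
    w = w_global.copy()
    for _ in range(steps):
        # Gradient of the local loss plus gradient of the Bregman penalty (mu/2)*||w - w_global||^2
        g = grad_fn(w) + mu * (w - w_global)
        w -= lr * g
    return w


def gaussian_perturb(w_personal, w_global, clip_norm=1.0, noise_multiplier=1.0, rng=None):
    """Clip the personalized update and add Gaussian noise so the uploaded
    parameters satisfy differential privacy on the uplink channel."""
    rng = rng or np.random.default_rng()
    delta = w_personal - w_global
    delta *= min(1.0, clip_norm / (np.linalg.norm(delta) + 1e-12))  # bound the update's sensitivity
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=delta.shape)
    return w_global + delta + noise  # perturbed personalized model sent over the uplink
```

Under this reading, the proximal coefficient mu controls how tightly the local model is tied to the global model, while noise_multiplier trades privacy strength against the accuracy of the uploaded parameters.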
Journal introduction:
The Journal of Cloud Computing: Advances, Systems and Applications (JoCCASA) publishes research articles on all aspects of Cloud Computing. Principally, articles address topics that are core to Cloud Computing, focusing on Cloud applications, Cloud systems, and the advances that will lead to the Clouds of the future. Comprehensive review and survey articles that offer new insights and lay the foundations for further exploratory and experimental work are also relevant.