{"title":"FedFR-ADP: Adaptive differential privacy with feedback regulation for robust model performance in federated learning","authors":"Debao Wang, Shaopeng Guan","doi":"10.1016/j.inffus.2024.102796","DOIUrl":null,"url":null,"abstract":"<div><div>Privacy preservation is a critical concern in Federated Learning (FL). However, traditional Local Differential Privacy (LDP) methods face challenges in balancing FL model accuracy with noise strength. To address this, we propose a novel adaptive differential privacy method with feedback regulation, FedFR-ADP. First, we employ Earth Mover’s Distance (EMD) to measure the data heterogeneity of each client and adaptively apply Gaussian noise based on the degree of heterogeneity, making the noise addition more targeted and effective. Second, we introduce a feedback regulation mechanism to dynamically tune the privacy budget according to the global model’s error feedback, further enhancing model performance. Finally, we validate our approach through experiments on two commonly used image classification datasets. The experimental results demonstrate that FedFR-ADP outperforms three benchmark algorithms, including DP-FedAvg, in terms of model training accuracy and Mean Squared Error (MSE) under varying degrees of heterogeneity. Compared to these benchmarks, FedFR-ADP achieves improvements of at least 3.05% and 1.76% in training accuracy on the two datasets, respectively, with significantly reduced MSE fluctuations. This not only boosts model accuracy but also provides more stable noise control.</div></div>","PeriodicalId":50367,"journal":{"name":"Information Fusion","volume":"116 ","pages":"Article 102796"},"PeriodicalIF":14.7000,"publicationDate":"2024-11-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Fusion","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1566253524005748","RegionNum":1,"RegionCategory":"Computer Science","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Privacy preservation is a critical concern in Federated Learning (FL). However, traditional Local Differential Privacy (LDP) methods face challenges in balancing FL model accuracy with noise strength. To address this, we propose a novel adaptive differential privacy method with feedback regulation, FedFR-ADP. First, we employ Earth Mover’s Distance (EMD) to measure the data heterogeneity of each client and adaptively apply Gaussian noise based on the degree of heterogeneity, making the noise addition more targeted and effective. Second, we introduce a feedback regulation mechanism to dynamically tune the privacy budget according to the global model’s error feedback, further enhancing model performance. Finally, we validate our approach through experiments on two commonly used image classification datasets. The experimental results demonstrate that FedFR-ADP outperforms three benchmark algorithms, including DP-FedAvg, in terms of model training accuracy and Mean Squared Error (MSE) under varying degrees of heterogeneity. Compared to these benchmarks, FedFR-ADP achieves improvements of at least 3.05% and 1.76% in training accuracy on the two datasets, respectively, with significantly reduced MSE fluctuations. This not only boosts model accuracy but also provides more stable noise control.
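The two mechanisms described in the abstract can be sketched in code. This is an illustrative outline only, not the authors' implementation: the function names (`emd_1d`, `adaptive_sigma`, `feedback_epsilon`), the direction in which noise scales with heterogeneity, and the feedback gain are all assumptions made for the example.

```python
import numpy as np

def emd_1d(p, q):
    """EMD between two discrete distributions over the same ordered bins.
    In one dimension this equals the L1 distance between the CDFs."""
    return float(np.abs(np.cumsum(p) - np.cumsum(q)).sum())

def adaptive_sigma(emd, base_sigma=1.0, alpha=0.5):
    """Hypothetical mapping from a client's heterogeneity (EMD between its
    label distribution and the global one) to a Gaussian noise scale.
    Here more heterogeneous clients get less noise; the paper's actual
    mapping may differ."""
    return base_sigma / (1.0 + alpha * emd)

def feedback_epsilon(epsilon, error, target_error, gain=0.1):
    """Hypothetical feedback regulation of the privacy budget: relax
    epsilon (less noise) when the global model's error exceeds the target,
    tighten it otherwise. A floor keeps epsilon positive."""
    return max(epsilon * (1.0 + gain * (error - target_error)), 1e-3)

def noisy_update(update, emd, rng):
    """Perturb a client's model update with heterogeneity-scaled noise."""
    sigma = adaptive_sigma(emd)
    return update + rng.normal(0.0, sigma, size=update.shape)
```

Under these assumptions, a client whose label distribution matches the global one (EMD = 0) receives the base noise scale, while the server nudges every client's budget up or down each round based on the aggregated error signal.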
About the journal:
Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers dealing with fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.