Protecting health monitoring privacy in fitness training: A federated learning framework based on personalized differential privacy

Author: Lifang Shao
Journal: Internet Technology Letters, vol. 69, no. 18
Published: 2024-01-07
DOI: 10.1002/itl2.499 (https://doi.org/10.1002/itl2.499)
Citations: 0
Abstract
The rapid advancement of health monitoring technologies has led to increased adoption of fitness training applications that collect and analyze personal health data. This paper presents a personalized differential privacy-based federated learning (PDP-FL) algorithm with two stages. In the first stage, users' privacy is classified according to their preferences, and noise is added locally to achieve personalized privacy protection; each user's privacy preference and the corresponding privacy level are then sent to the central aggregation server. In the second stage, the server adds noise that conforms to the global differential privacy threshold, based on the privacy levels users uploaded. This allows the global privacy protection level to be quantified while local and central protection strategies are applied simultaneously to protect the global data in full. The results demonstrate the strong classification accuracy of the proposed PDP-FL algorithm, which addresses the critical issue of health monitoring privacy in fitness training applications: it ensures that sensitive data is handled responsibly and gives users the tools needed to control their privacy settings. By achieving high classification accuracy while preserving privacy, the framework balances data utility and protection, benefiting both the health monitoring ecosystem and medical systems.
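The two-stage idea described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the privacy-level-to-epsilon mapping, the L2 clipping bound, the global epsilon threshold, the Laplace mechanism, and the simplified server-side noise top-up are all assumptions introduced here for clarity.

```python
import numpy as np

# Hypothetical parameters -- not taken from the paper.
PRIVACY_LEVELS = {"high": 0.5, "medium": 1.0, "low": 2.0}  # smaller eps => more noise
SENSITIVITY = 1.0      # assumed L2 clipping bound for a client update
GLOBAL_EPSILON = 1.0   # assumed global differential-privacy threshold

def local_update(gradient, preference, rng):
    """Stage 1: clip the client update and add noise scaled to the user's
    chosen privacy level (Laplace mechanism, an assumption here)."""
    eps = PRIVACY_LEVELS[preference]
    norm = np.linalg.norm(gradient)
    clipped = gradient / max(1.0, norm / SENSITIVITY)
    noisy = clipped + rng.laplace(0.0, SENSITIVITY / eps, size=gradient.shape)
    return noisy, eps  # the privacy level travels with the update

def server_aggregate(updates, rng):
    """Stage 2: average the noisy updates and, if the weakest local
    guarantee falls short of the global target, add noise centrally
    (a simplified stand-in for the paper's quantification step)."""
    grads, epsilons = zip(*updates)
    mean = np.mean(grads, axis=0)
    weakest = max(epsilons)  # largest epsilon = least local noise
    if weakest > GLOBAL_EPSILON:
        extra_scale = SENSITIVITY * (1 / GLOBAL_EPSILON - 1 / weakest) / len(grads)
        mean = mean + rng.laplace(0.0, extra_scale, size=mean.shape)
    return mean

rng = np.random.default_rng(0)
clients = [(np.ones(4), "high"), (np.ones(4), "low"), (np.ones(4), "medium")]
updates = [local_update(g, p, rng) for g, p in clients]
model_delta = server_aggregate(updates, rng)
print(model_delta.shape)  # (4,)
```

The point of the sketch is the division of labor: each client enforces only its own preferred level, while the server quantifies the aggregate guarantee and tops up noise so the released model satisfies the global threshold regardless of how lax any individual preference was.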