Secure Federated Learning with a Homomorphic Encryption Model

Nadia Hussien, Nadia Mahmood Hussien, Saba Abdulbaqi Salman, Mohammad Aljanabi

International Journal Papier Advance and Scientific Review, Vol. 4, No. 3 (November 2023). DOI: https://doi.org/10.47667/ijpasr.v4i3.235
Federated learning (FL) enables collaborative machine learning across decentralized devices while safeguarding data privacy. However, data security and privacy remain key concerns. This paper introduces "Secure Federated Learning with a Homomorphic Encryption Model," which addresses these challenges by integrating homomorphic encryption into FL. The model begins by initializing a global machine learning model and generating a homomorphic encryption key pair, with the public key shared among FL participants. Using this public key, participants collect, preprocess, and encrypt their local data. During the FL training rounds, participants decrypt the global model, compute local updates on their encrypted data, encrypt these updates, and securely send them to the aggregator. The aggregator homomorphically combines the updates without revealing participant data and forwards the encrypted aggregated update to the global model owner. In the global model update step, the owner decrypts the aggregated update with the private key, updates the global model, encrypts it with the public key, and shares the encrypted global model with FL participants. With optional model evaluation, training can iterate for several rounds or until convergence. This model offers a robust solution to FL's data privacy and security issues, with versatile applications across domains. The paper presents the core model components, their advantages, and potential domain-specific implementations, making significant strides toward addressing FL's data privacy concerns.
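As a minimal sketch of one such training round, the snippet below uses the python-paillier (phe) package for additive homomorphic aggregation. The abstract does not specify an encryption scheme, model, or training procedure, so the local_update() routine, the toy weights, and the participant data are illustrative assumptions rather than the paper's method.

```python
# Minimal sketch of one FL round with homomorphic aggregation of local updates.
# Assumes the python-paillier package (phe); model weights are a plain list of floats.
from phe import paillier

# Key generation: the global model owner keeps the private key,
# while participants receive only the public key.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)


def local_update(global_weights, local_data):
    """Toy local training step (illustrative): nudge each weight toward the data mean."""
    mean = sum(local_data) / len(local_data)
    return [w + 0.1 * (mean - w) for w in global_weights]


def encrypt_update(update):
    """Participant encrypts its local update with the shared public key."""
    return [public_key.encrypt(u) for u in update]


# --- One training round (toy data, three participants) ---
global_weights = [0.0, 0.0, 0.0]
participant_data = [[1.0, 2.0, 3.0], [2.0, 4.0, 6.0], [0.5, 1.5, 2.5]]

# Each participant computes and encrypts a local update.
encrypted_updates = [
    encrypt_update(local_update(global_weights, data))
    for data in participant_data
]

# Aggregator sums the ciphertexts coordinate-wise; it never sees plaintext updates.
n = len(encrypted_updates)
encrypted_sum = [
    sum(update[i] for update in encrypted_updates)
    for i in range(len(global_weights))
]

# Global model owner decrypts the aggregate with the private key and averages it.
global_weights = [private_key.decrypt(c) / n for c in encrypted_sum]
print(global_weights)
```

Paillier is additively homomorphic, which is enough to sum encrypted updates at the aggregator; a scheme supporting richer operations (e.g., CKKS) would be needed if the aggregator had to do more than addition, but the abstract leaves that choice open.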