D. M. S. Bhatti, Haewoon Nam. 2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC), 2023-02-20. DOI: 10.1109/ICAIIC57133.2023.10066985
A Performance Efficient Approach of Global Training in Federated Learning
Federated learning is a novel approach to training a global model on a server using end users' personal data while preserving data privacy. The users, called clients, perform local training on their local datasets and forward the trained local models to the server, where the local models are aggregated to update the global model. This global training process is repeated for several rounds until convergence. In practice, clients' data are not independent and identically distributed (non-IID), so each client's updated local model may differ from every other client's due to the heterogeneity among them. Consequently, how these diverse local models are aggregated has a large impact on the performance of global training. This article proposes a performance-efficient aggregation approach for federated learning that accounts for data heterogeneity among clients before aggregating the received local models. The proposed approach is compared with conventional federated learning methods and achieves improved performance.
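The round structure described in the abstract (local training on each client, upload to the server, server-side aggregation into a new global model) can be sketched with conventional federated averaging (FedAvg). The abstract does not specify the paper's heterogeneity-aware weighting, so this sketch uses the standard dataset-size weighting instead, and the logistic-regression local model is an illustrative assumption, not the authors' model.

```python
import numpy as np

def local_train(global_weights, X, y, lr=0.1, epochs=5):
    # One client's local update; logistic-regression SGD is a stand-in
    # for whatever model the clients actually train.
    w = global_weights.copy()
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))
        grad = X.T @ (preds - y) / len(y)
        w -= lr * grad
    return w

def fedavg(client_weights, client_sizes):
    # Server-side aggregation: average the local models, each weighted
    # by its client's dataset size (plain FedAvg; the paper replaces this
    # weighting with a heterogeneity-aware one).
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# One round of global training over three simulated clients with
# non-uniform dataset sizes, mimicking a non-IID setting.
rng = np.random.default_rng(0)
d = 4
global_w = np.zeros(d)
clients = []
for n in (50, 80, 30):
    X = rng.normal(size=(n, d))
    y = (X[:, 0] > 0).astype(float)
    clients.append((X, y))

local_models = [local_train(global_w, X, y) for X, y in clients]
global_w = fedavg(local_models, [len(y) for _, y in clients])
```

In a full run, the updated `global_w` would be broadcast back to the clients and the round repeated until convergence; the proposed method modifies only the aggregation step.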