{"title":"WAFL-GAN: Wireless Ad Hoc Federated Learning for Distributed Generative Adversarial Networks","authors":"Eisuke Tomiyama, H. Esaki, H. Ochiai","doi":"10.1109/KST57286.2023.10086811","DOIUrl":null,"url":null,"abstract":"Diverse images are needed to train Generative Adversarial Network (GAN) with diverse image output, but privacy is a major issue. To protect privacy, federated learning has been proposed, but in conventional federated learning, the parameter server is a third party to the client. We propose WAFL-GAN, which does not require a third party, and which assumes that each node participating in the learning process is mobile and can communicate wirelessly with each other. Each node is trained only with the data it has locally, and when nodes opportunistically contact each other, they exchange and aggregate model parameters without exchanging raw data. This allows all nodes to eventually have a general model and produce a general output, even if each node has a dataset with a Non-IID distribution. We evaluated WAFL-GAN on the Non-IID MNIST dataset and quantitatively showed that the output diversity of WAFL-GAN can be as high as that of conventional federated learning.","PeriodicalId":351833,"journal":{"name":"2023 15th International Conference on Knowledge and Smart Technology (KST)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-02-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 15th International Conference on Knowledge and Smart Technology (KST)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/KST57286.2023.10086811","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Diverse images are needed to train a Generative Adversarial Network (GAN) that produces diverse output, but privacy is a major concern. Federated learning has been proposed to protect privacy, but in conventional federated learning the parameter server is a third party relative to the clients. We propose WAFL-GAN, which requires no third party and assumes that the nodes participating in learning are mobile and can communicate wirelessly with one another. Each node trains only on its local data, and when nodes opportunistically come into contact, they exchange and aggregate model parameters without exchanging any raw data. This allows all nodes to eventually converge on a general model and produce general output, even when each node holds a dataset with a Non-IID distribution. We evaluated WAFL-GAN on a Non-IID partitioning of the MNIST dataset and quantitatively showed that its output diversity can be as high as that of conventional federated learning.
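
The abstract does not give the aggregation rule, so the following is a minimal sketch in PyTorch, assuming simple pairwise averaging of GAN parameters when two nodes meet; the class and function names (Node, on_contact, average_state_dicts) and the tiny MLP architectures are illustrative assumptions, not the paper's implementation.

import copy
import torch
import torch.nn as nn

class Node:
    """One mobile node holding a local GAN trained only on its own data."""
    def __init__(self, latent_dim: int = 16):
        self.generator = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, 784), nn.Tanh(),
        )
        self.discriminator = nn.Sequential(
            nn.Linear(784, 128), nn.LeakyReLU(0.2),
            nn.Linear(128, 1),
        )

def average_state_dicts(a: dict, b: dict) -> dict:
    """Element-wise mean of two compatible state dicts (assumed rule)."""
    return {k: (a[k] + b[k]) / 2.0 for k in a}

def on_contact(n1: Node, n2: Node) -> None:
    """On opportunistic contact, exchange and aggregate model parameters.
    Raw training data is never exchanged."""
    g = average_state_dicts(n1.generator.state_dict(),
                            n2.generator.state_dict())
    d = average_state_dicts(n1.discriminator.state_dict(),
                            n2.discriminator.state_dict())
    for n in (n1, n2):
        n.generator.load_state_dict(copy.deepcopy(g))
        n.discriminator.load_state_dict(copy.deepcopy(d))

# Usage: two nodes meet and merge models; repeated contacts across the
# network would gradually spread every node's local knowledge.
a, b = Node(), Node()
on_contact(a, b)

Under this reading, repeated pairwise exchanges play the role of the central parameter server in conventional federated learning, which is how every node can eventually obtain a general model despite only ever seeing its own Non-IID data.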