{"title":"FFD: A Full-Stack Federated Distillation method for Heterogeneous Massive IoT Networks","authors":"Minh-Duong Nguyen, Hong-Son Luong, Tung-Nguyen, Viet Quoc Pham, Q. Do, W. Hwang","doi":"10.1109/ATC55345.2022.9943034","DOIUrl":null,"url":null,"abstract":"Data imbalance and complexity are the key challenges of applying federated learning (FL) techniques for wireless networks. In this paper, we propose a novel framework inspired by a divide-and-conquer algorithm. We aim to develop a full-stack federated distillation (FFD) method for federated learning over a massive Internet of Things network. We first divide the network into sub-regions that can be represented by a neural network model. After performing local training, these models are then aggregated into a global model by using a novel knowledge-distillation method. This FFD method allows each local model to be efficiently updated by learning the features of the other models. Furthermore, this method can be easily deployed in new and large-scaled environments without requiring the models to be re-trained from scratch. Finally, we conduct extensive simulations to evaluate the performance of the proposed FFD method. The results show that our solution outperforms many contemporary FL techniques with non-IID (i.e., not independent and identically distributed) and imbalanced data.","PeriodicalId":135827,"journal":{"name":"2022 International Conference on Advanced Technologies for Communications (ATC)","volume":"122 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Advanced Technologies for Communications (ATC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ATC55345.2022.9943034","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Data imbalance and complexity are the key challenges of applying federated learning (FL) techniques to wireless networks. In this paper, we propose a novel framework inspired by the divide-and-conquer strategy. We aim to develop a full-stack federated distillation (FFD) method for federated learning over massive Internet of Things (IoT) networks. We first divide the network into sub-regions, each of which is represented by its own neural network model. After local training, these models are aggregated into a global model using a novel knowledge-distillation method. The FFD method allows each local model to be efficiently updated by learning the features of the other models. Furthermore, the method can be easily deployed in new, large-scale environments without retraining the models from scratch. Finally, we conduct extensive simulations to evaluate the performance of the proposed FFD method. The results show that our solution outperforms many contemporary FL techniques on non-IID (i.e., not independent and identically distributed) and imbalanced data.
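The abstract does not detail how the sub-region models are fused, so the following is only a minimal sketch of the general federated-distillation aggregation idea it alludes to, not the authors' exact FFD procedure. It assumes a shared proxy dataset (`proxy_loader`), a uniform averaging of the local models' logits as the teacher ensemble, and a softmax temperature; all three are illustrative assumptions rather than details from the paper.

```python
import torch
import torch.nn.functional as F

def distill_global_model(global_model, local_models, proxy_loader,
                         temperature=2.0, lr=1e-3, epochs=1):
    """Aggregate local (sub-region) models into a global model via
    knowledge distillation on a shared proxy dataset.

    NOTE: a generic federated-distillation sketch; proxy_loader,
    temperature, and uniform teacher averaging are assumptions,
    not the paper's specified FFD method.
    """
    optimizer = torch.optim.Adam(global_model.parameters(), lr=lr)
    for m in local_models:
        m.eval()  # local models act as fixed teachers
    global_model.train()
    for _ in range(epochs):
        for x, _ in proxy_loader:  # labels unused: distillation is label-free
            with torch.no_grad():
                # Ensemble the sub-region models by averaging their logits.
                teacher_logits = torch.stack([m(x) for m in local_models]).mean(0)
                soft_targets = F.softmax(teacher_logits / temperature, dim=1)
            student_log_probs = F.log_softmax(global_model(x) / temperature, dim=1)
            # KL divergence between student and ensemble-teacher distributions,
            # scaled by T^2 as is standard in distillation losses.
            loss = F.kl_div(student_log_probs, soft_targets,
                            reduction="batchmean") * temperature ** 2
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return global_model
```

Because only soft predictions on the proxy data cross the aggregation boundary, this style of aggregation works even when the sub-region models see non-IID, imbalanced local data, which is consistent with the setting the abstract targets.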