Communication Efficient Heterogeneous Federated Learning based on Model Similarity

Zhaojie Li, T. Ohtsuki, Guan Gui
2023 IEEE Wireless Communications and Networking Conference (WCNC), March 2023. DOI: 10.1109/WCNC55385.2023.10118862

Abstract: Federated learning is now widely used to train neural networks on distributed datasets. One of its main challenges is training under local data heterogeneity. Existing work has shown that taking model similarity into account in federated learning can speed up model aggregation. We propose a novel approach that introduces Centered Kernel Alignment (CKA) into the loss function to compute the similarity of feature maps at the output layer. Compared to existing methods, our approach achieves faster model aggregation and higher global-model accuracy in non-IID scenarios using ResNet-50.
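The abstract does not spell out how the CKA similarity term is computed, but the standard linear-CKA formula between two sets of feature maps is well established. The sketch below (NumPy, with hypothetical variable names; it is not the paper's exact loss implementation) computes linear CKA between two batches of flattened output-layer features, one value per pair of models:

```python
import numpy as np

def _center_gram(K):
    # Double-center a Gram matrix: HKH with H = I - (1/n) * ones.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def linear_cka(X, Y):
    """Linear CKA between feature maps X (n, d1) and Y (n, d2).

    Rows are the same n input samples passed through two models;
    columns are (flattened) output-layer features. Returns a value
    in [0, 1]; 1 means the representations are identical up to an
    invertible linear transform.
    """
    K = _center_gram(X @ X.T)
    L = _center_gram(Y @ Y.T)
    hsic = np.sum(K * L)                      # <K, L>_F (unnormalized HSIC)
    return hsic / (np.linalg.norm(K) * np.linalg.norm(L))
```

A similarity term like `1 - linear_cka(local_feats, global_feats)` could then be added to each client's training loss to pull local representations toward the global model's; the exact weighting and aggregation rule used in the paper are not given in the abstract.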