{"title":"基于逻辑聚类的车联网个性化联合多任务学习","authors":"Biao Zhang, Siya Xu, Xusong Qiu, Jingyue Tian","doi":"10.1109/BMSB58369.2023.10211135","DOIUrl":null,"url":null,"abstract":"Federated learning (FL) is an emerging distributed machine learning paradigm that emphasizes user privacy. The majority of current federated learning systems are oriented for single task while participants in the Internet of Vehicles (IoV) demand to train different models for multiple intelligent services simultaneously. As one of transfer learning methods, multi-task learning (MTL) has the potential to integrate with federated learning to realize personalized local training. However, the existing federated multi-task learning (FMTL) algorithms are faced with the problems of high implementation complexity and communication overhead. To solve the above issues, we propose a logical-cluster-based personalized federated multi-task learning framework named pFMTL. In the framework, the multi-task model is decomposed into a basic module for extracting features and K task-specific modules for outputting inferences. We leverage logical clusters and multi-task learning to enhance the personalization and generalization capability of task models, respectively. To improve the communication efficiency further, we also design a module-wise task scheduling strategy, which supports both user module scheduling and cluster aggregation scheduling to ensure the convergence of multi-task model with less communication overhead. Finally, the simulation results imply that pFMTL can increase task accuracy and reduce communication latency compared with other benchmarks.","PeriodicalId":13080,"journal":{"name":"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting","volume":"1 1","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Logical-Cluster-Based Personalized Federated Multi-Task Learning for Internet of Vehicles\",\"authors\":\"Biao Zhang, Siya Xu, Xusong Qiu, Jingyue Tian\",\"doi\":\"10.1109/BMSB58369.2023.10211135\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning (FL) is an emerging distributed machine learning paradigm that emphasizes user privacy. The majority of current federated learning systems are oriented for single task while participants in the Internet of Vehicles (IoV) demand to train different models for multiple intelligent services simultaneously. As one of transfer learning methods, multi-task learning (MTL) has the potential to integrate with federated learning to realize personalized local training. However, the existing federated multi-task learning (FMTL) algorithms are faced with the problems of high implementation complexity and communication overhead. To solve the above issues, we propose a logical-cluster-based personalized federated multi-task learning framework named pFMTL. In the framework, the multi-task model is decomposed into a basic module for extracting features and K task-specific modules for outputting inferences. We leverage logical clusters and multi-task learning to enhance the personalization and generalization capability of task models, respectively. To improve the communication efficiency further, we also design a module-wise task scheduling strategy, which supports both user module scheduling and cluster aggregation scheduling to ensure the convergence of multi-task model with less communication overhead. 
Finally, the simulation results imply that pFMTL can increase task accuracy and reduce communication latency compared with other benchmarks.\",\"PeriodicalId\":13080,\"journal\":{\"name\":\"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting\",\"volume\":\"1 1\",\"pages\":\"1-6\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-06-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/BMSB58369.2023.10211135\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BMSB58369.2023.10211135","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Logical-Cluster-Based Personalized Federated Multi-Task Learning for Internet of Vehicles
Federated learning (FL) is an emerging distributed machine learning paradigm that emphasizes user privacy. Most current federated learning systems are oriented toward a single task, whereas participants in the Internet of Vehicles (IoV) need to train different models for multiple intelligent services simultaneously. As a transfer learning method, multi-task learning (MTL) has the potential to be integrated with federated learning to realize personalized local training. However, existing federated multi-task learning (FMTL) algorithms suffer from high implementation complexity and communication overhead. To address these issues, we propose a logical-cluster-based personalized federated multi-task learning framework named pFMTL. In this framework, the multi-task model is decomposed into a basic module that extracts features and K task-specific modules that output inferences. We leverage logical clusters and multi-task learning to enhance the personalization and generalization capability of the task models, respectively. To further improve communication efficiency, we also design a module-wise task scheduling strategy that supports both user module scheduling and cluster aggregation scheduling, ensuring convergence of the multi-task model with less communication overhead. Finally, simulation results show that pFMTL improves task accuracy and reduces communication latency compared with other benchmarks.
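
To make the described decomposition concrete, below is a minimal sketch (not the authors' implementation) of a multi-task model split into one shared basic module for feature extraction and K task-specific heads for inference. The use of PyTorch, the class and parameter names, and the layer sizes are all illustrative assumptions; the paper itself defines the actual module structure, clustering, and scheduling rules.

```python
# Illustrative sketch of the basic-module / task-specific-module split
# described in the abstract. Framework (PyTorch), layer sizes, and names
# are assumptions, not the authors' code.
import torch
import torch.nn as nn


class MultiTaskModel(nn.Module):
    def __init__(self, in_dim: int, feat_dim: int, task_out_dims: list[int]):
        super().__init__()
        # Basic module: shared feature extractor (candidate for global aggregation).
        self.basic = nn.Sequential(
            nn.Linear(in_dim, feat_dim),
            nn.ReLU(),
        )
        # K task-specific modules: one output head per intelligent service.
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, out_dim) for out_dim in task_out_dims]
        )

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        # Route the shared features through the head of the requested task.
        return self.heads[task_id](self.basic(x))


# Example: K = 3 tasks with different output sizes.
model = MultiTaskModel(in_dim=32, feat_dim=64, task_out_dims=[10, 5, 2])
logits = model(torch.randn(8, 32), task_id=1)  # shape: (8, 5)
```

Under such a split, a server could plausibly aggregate the basic module across all participants while aggregating each task-specific head only within its logical cluster, matching the personalization/generalization roles described above; the precise aggregation and module-wise scheduling strategy is specified in the paper.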