Logical-Cluster-Based Personalized Federated Multi-Task Learning for Internet of Vehicles

Biao Zhang, Siya Xu, Xusong Qiu, Jingyue Tian
{"title":"基于逻辑聚类的车联网个性化联合多任务学习","authors":"Biao Zhang, Siya Xu, Xusong Qiu, Jingyue Tian","doi":"10.1109/BMSB58369.2023.10211135","DOIUrl":null,"url":null,"abstract":"Federated learning (FL) is an emerging distributed machine learning paradigm that emphasizes user privacy. The majority of current federated learning systems are oriented for single task while participants in the Internet of Vehicles (IoV) demand to train different models for multiple intelligent services simultaneously. As one of transfer learning methods, multi-task learning (MTL) has the potential to integrate with federated learning to realize personalized local training. However, the existing federated multi-task learning (FMTL) algorithms are faced with the problems of high implementation complexity and communication overhead. To solve the above issues, we propose a logical-cluster-based personalized federated multi-task learning framework named pFMTL. In the framework, the multi-task model is decomposed into a basic module for extracting features and K task-specific modules for outputting inferences. We leverage logical clusters and multi-task learning to enhance the personalization and generalization capability of task models, respectively. To improve the communication efficiency further, we also design a module-wise task scheduling strategy, which supports both user module scheduling and cluster aggregation scheduling to ensure the convergence of multi-task model with less communication overhead. Finally, the simulation results imply that pFMTL can increase task accuracy and reduce communication latency compared with other benchmarks.","PeriodicalId":13080,"journal":{"name":"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting","volume":"1 1","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2023-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Logical-Cluster-Based Personalized Federated Multi-Task Learning for Internet of Vehicles\",\"authors\":\"Biao Zhang, Siya Xu, Xusong Qiu, Jingyue Tian\",\"doi\":\"10.1109/BMSB58369.2023.10211135\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated learning (FL) is an emerging distributed machine learning paradigm that emphasizes user privacy. The majority of current federated learning systems are oriented for single task while participants in the Internet of Vehicles (IoV) demand to train different models for multiple intelligent services simultaneously. As one of transfer learning methods, multi-task learning (MTL) has the potential to integrate with federated learning to realize personalized local training. However, the existing federated multi-task learning (FMTL) algorithms are faced with the problems of high implementation complexity and communication overhead. To solve the above issues, we propose a logical-cluster-based personalized federated multi-task learning framework named pFMTL. In the framework, the multi-task model is decomposed into a basic module for extracting features and K task-specific modules for outputting inferences. We leverage logical clusters and multi-task learning to enhance the personalization and generalization capability of task models, respectively. To improve the communication efficiency further, we also design a module-wise task scheduling strategy, which supports both user module scheduling and cluster aggregation scheduling to ensure the convergence of multi-task model with less communication overhead. 
Finally, the simulation results imply that pFMTL can increase task accuracy and reduce communication latency compared with other benchmarks.\",\"PeriodicalId\":13080,\"journal\":{\"name\":\"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting\",\"volume\":\"1 1\",\"pages\":\"1-6\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-06-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/BMSB58369.2023.10211135\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE international Symposium on Broadband Multimedia Systems and Broadcasting","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/BMSB58369.2023.10211135","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Federated learning (FL) is an emerging distributed machine learning paradigm that emphasizes user privacy. Most current federated learning systems are oriented toward a single task, whereas participants in the Internet of Vehicles (IoV) need to train different models for multiple intelligent services simultaneously. As a transfer learning method, multi-task learning (MTL) has the potential to be integrated with federated learning to realize personalized local training. However, existing federated multi-task learning (FMTL) algorithms suffer from high implementation complexity and communication overhead. To address these issues, we propose a logical-cluster-based personalized federated multi-task learning framework named pFMTL. In this framework, the multi-task model is decomposed into a basic module that extracts features and K task-specific modules that output inferences. We leverage logical clusters and multi-task learning to enhance the personalization and generalization capability of the task models, respectively. To further improve communication efficiency, we also design a module-wise task scheduling strategy that supports both user module scheduling and cluster aggregation scheduling, ensuring convergence of the multi-task model with less communication overhead. Finally, simulation results show that pFMTL improves task accuracy and reduces communication latency compared with other benchmarks.
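
The decomposition described in the abstract, a shared basic module plus K task-specific heads, with only the shared part aggregated at the cluster level, can be illustrated with a short sketch. The following PyTorch example is a minimal, hypothetical illustration: the layer sizes, the class and function names (BasicModule, MultiTaskModel, aggregate_basic_modules), and the plain FedAvg-style averaging are assumptions made here and do not reproduce the paper's exact algorithm or its module-wise scheduling strategy.

```python
# Minimal sketch of a shared "basic module" with K task-specific heads,
# plus FedAvg-style averaging of the shared part within one logical cluster.
# All names and hyperparameters below are illustrative assumptions.
import copy
from typing import Dict, List

import torch
import torch.nn as nn


class BasicModule(nn.Module):
    """Shared feature extractor (the 'basic module')."""

    def __init__(self, in_dim: int = 32, feat_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


class MultiTaskModel(nn.Module):
    """Basic module plus K task-specific heads that output per-task inferences."""

    def __init__(self, num_tasks: int, in_dim: int = 32, feat_dim: int = 64, out_dim: int = 4):
        super().__init__()
        self.basic = BasicModule(in_dim, feat_dim)
        self.heads = nn.ModuleList([nn.Linear(feat_dim, out_dim) for _ in range(num_tasks)])

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        # Only the head of the requested task is evaluated.
        return self.heads[task_id](self.basic(x))


def aggregate_basic_modules(cluster_models: List[MultiTaskModel]) -> Dict[str, torch.Tensor]:
    """Average the basic-module weights of the vehicles in one logical cluster.

    Task heads are never averaged here; they stay local, which is what keeps
    each vehicle's task models personalized.
    """
    states = [m.basic.state_dict() for m in cluster_models]
    avg = copy.deepcopy(states[0])
    for key in avg:
        avg[key] = torch.stack([s[key].float() for s in states]).mean(dim=0)
    return avg


if __name__ == "__main__":
    # Three vehicles in one logical cluster, each training K = 2 task heads.
    vehicles = [MultiTaskModel(num_tasks=2) for _ in range(3)]
    x = torch.randn(8, 32)
    print(vehicles[0](x, task_id=1).shape)       # torch.Size([8, 4])
    shared = aggregate_basic_modules(vehicles)   # cluster-level aggregation
    for v in vehicles:
        v.basic.load_state_dict(shared)          # broadcast the shared features back
```

Averaging only the basic module lets a single aggregation round serve all K tasks while the heads remain local and personalized. The paper's module-wise scheduling additionally decides which modules are exchanged in each round to cut communication overhead; that scheduling logic is not modeled in this sketch.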