Multi-Granularity Weighted Federated Learning for Heterogeneous Edge Computing

IEEE Transactions on Services Computing · IF 5.8 · CAS Region 2 (Computer Science) · Q1, COMPUTER SCIENCE, INFORMATION SYSTEMS · Pub Date: 2024-11-11 · DOI: 10.1109/TSC.2024.3495532
Yunfeng Zhao;Chao Qiu;Shangxuan Cai;Zhicheng Liu;Yu Wang;Xiaofei Wang;Qinghua Hu
{"title":"Multi-Granularity Weighted Federated Learning for Heterogeneous Edge Computing","authors":"Yunfeng Zhao;Chao Qiu;Shangxuan Cai;Zhicheng Liu;Yu Wang;Xiaofei Wang;Qinghua Hu","doi":"10.1109/TSC.2024.3495532","DOIUrl":null,"url":null,"abstract":"Federated learning (FL), an advanced variant of distributed machine learning, enables clients to collaboratively train a model without sharing raw data, thereby enhancing privacy, security, and reducing communication overhead. However, in edge computing scenarios, there is an increasing trend towards diversity, heterogeneity, and complexity in clients’ data and models. The fundamental challenges, such as non-independent and identically distributed (non-IID) data and multi-granularity data accompanied by model heterogeneity, have become more evident and pose challenges to collaborative training among clients. In this paper, we refine the FL framework and propose the Multi-granularity Weighted Federated Learning (MGW-FL), emphasizing efficient collaborative training among clients with varied data granularities and diverse model scales across distinct data distributions. We introduce a distance-based FL mechanism designed for homogeneous clients, providing personalized models to mitigate the negative effects that non-IID data might have on model aggregation. Simultaneously, we propose an attention-weighted FL mechanism enhanced by a prior attention mechanism, facilitating knowledge transfer across clients with heterogeneous data granularities and model scales. Furthermore, we provide theoretical analyses of the convergence properties of the proposed MGW-FL method for both convex and non-convex models. 
Experimental results on five benchmark datasets demonstrate that, compared to baseline methods, MGW-FL significantly improves accuracy by almost 150% and convergence efficiency by nearly 20% on both IID and non-IID data.","PeriodicalId":13255,"journal":{"name":"IEEE Transactions on Services Computing","volume":"18 1","pages":"270-287"},"PeriodicalIF":5.8000,"publicationDate":"2024-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Services Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10750029/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Federated learning (FL), an advanced variant of distributed machine learning, enables clients to collaboratively train a model without sharing raw data, thereby enhancing privacy and security while reducing communication overhead. However, edge computing scenarios show an increasing trend towards diversity, heterogeneity, and complexity in clients' data and models. Fundamental challenges, such as non-independent and identically distributed (non-IID) data and multi-granularity data accompanied by model heterogeneity, have become more evident and complicate collaborative training among clients. In this paper, we refine the FL framework and propose Multi-Granularity Weighted Federated Learning (MGW-FL), emphasizing efficient collaborative training among clients with varied data granularities and diverse model scales across distinct data distributions. We introduce a distance-based FL mechanism designed for homogeneous clients, providing personalized models to mitigate the negative effects that non-IID data might have on model aggregation. Simultaneously, we propose an attention-weighted FL mechanism enhanced by a prior attention mechanism, facilitating knowledge transfer across clients with heterogeneous data granularities and model scales. Furthermore, we provide theoretical analyses of the convergence properties of the proposed MGW-FL method for both convex and non-convex models. Experimental results on five benchmark datasets demonstrate that, compared to baseline methods, MGW-FL significantly improves accuracy by almost 150% and convergence efficiency by nearly 20% on both IID and non-IID data.
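The abstract's distance-based mechanism for homogeneous clients can be illustrated with a minimal sketch: each client receives a personalized model built as a weighted average of all clients' parameters, with weights decaying in the distance between models so that similar (likely similarly-distributed) clients influence each other more. This is an illustrative simplification under assumed details (L2 distance, softmax weighting, flat parameter vectors), not the paper's exact MGW-FL formulation.

```python
import numpy as np

def distance_weighted_aggregate(models, temperature=1.0):
    """Personalized aggregation sketch: for each client i, weight every
    client j by softmax(-||theta_i - theta_j|| / temperature) and return
    the weighted average as client i's personalized model.

    models: list of 1-D numpy arrays (flattened model parameters).
    """
    personalized = []
    for theta_i in models:
        # Distance of client i to every client (including itself, distance 0).
        dists = np.array([np.linalg.norm(theta_i - theta_j) for theta_j in models])
        # Closer models get exponentially larger weights; normalize to sum to 1.
        weights = np.exp(-dists / temperature)
        weights /= weights.sum()
        personalized.append(sum(w * theta for w, theta in zip(weights, models)))
    return personalized

# Usage: two nearby clients and one distant outlier. The outlier
# contributes almost nothing to the first client's personalized model.
out = distance_weighted_aggregate([
    np.array([0.0, 0.0]),
    np.array([0.1, 0.0]),
    np.array([10.0, 10.0]),
])
```

A per-client softmax over negative distances is one common way to realize "personalized models that mitigate non-IID effects on aggregation"; the temperature controls how sharply aggregation concentrates on nearby clients.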
Source journal
IEEE Transactions on Services Computing
Categories: Computer Science, Information Systems; Computer Science, Software Engineering
CiteScore: 11.50
Self-citation rate: 6.20%
Articles per year: 278
Review time: >12 weeks
Journal description: IEEE Transactions on Services Computing encompasses the computing and software aspects of the science and technology of services innovation research and development. It places emphasis on algorithmic, mathematical, statistical, and computational methods central to services computing. Topics covered include Service Oriented Architecture, Web Services, Business Process Integration, Solution Performance Management, and Services Operations and Management. The transactions address mathematical foundations, security, privacy, agreement, contract, discovery, negotiation, collaboration, and quality of service for web services. It also covers areas like composite web service creation, business and scientific applications, standards, utility models, business process modeling, integration, collaboration, and more in the realm of Services Computing.