Multi-Criterion Client Selection for Efficient Federated Learning

Mehreen Tahir, Muhammad Intizar Ali
{"title":"高效联盟学习的多标准客户端选择","authors":"Mehreen Tahir, Muhammad Intizar Ali","doi":"10.1609/aaaiss.v3i1.31227","DOIUrl":null,"url":null,"abstract":"Federated Learning (FL) has received tremendous attention as a decentralized machine learning (ML) framework that allows distributed data owners to collaboratively train a global model without sharing raw data. Since FL trains the model directly on edge devices, the heterogeneity of participating clients in terms of data distribution, hardware capabilities and network connectivity can significantly impact the overall performance of FL systems. Optimizing for model accuracy could extend the training time due to the diverse and resource-constrained nature of edge devices while minimizing training time could compromise the model's accuracy. Effective client selection thus becomes crucial to ensure that the training process is not only efficient but also capitalizes on the diverse data and computational capabilities of different devices. To this end, we propose FedPROM, a novel framework that tackles client selection in FL as a multi-criteria optimization problem. By leveraging the PROMETHEE method, FedPROM ranks clients based on their suitability for a given FL task, considering multiple criteria such as system resources, network conditions, and data quality. This approach allows FedPROM to dynamically select the most appropriate set of clients for each learning round, optimizing both model accuracy and training efficiency. Our evaluations on diverse datasets demonstrate that FedPROM outperforms several state-of-the-art FL client selection protocols in terms of convergence speed, and accuracy, highlighting the framework's effectiveness and the importance of multi-criteria client selection in FL.","PeriodicalId":516827,"journal":{"name":"Proceedings of the AAAI Symposium Series","volume":"30 10","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-05-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multi-Criterion Client Selection for Efficient Federated Learning\",\"authors\":\"Mehreen Tahir, Muhammad Intizar Ali\",\"doi\":\"10.1609/aaaiss.v3i1.31227\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Federated Learning (FL) has received tremendous attention as a decentralized machine learning (ML) framework that allows distributed data owners to collaboratively train a global model without sharing raw data. Since FL trains the model directly on edge devices, the heterogeneity of participating clients in terms of data distribution, hardware capabilities and network connectivity can significantly impact the overall performance of FL systems. Optimizing for model accuracy could extend the training time due to the diverse and resource-constrained nature of edge devices while minimizing training time could compromise the model's accuracy. Effective client selection thus becomes crucial to ensure that the training process is not only efficient but also capitalizes on the diverse data and computational capabilities of different devices. To this end, we propose FedPROM, a novel framework that tackles client selection in FL as a multi-criteria optimization problem. By leveraging the PROMETHEE method, FedPROM ranks clients based on their suitability for a given FL task, considering multiple criteria such as system resources, network conditions, and data quality. 
This approach allows FedPROM to dynamically select the most appropriate set of clients for each learning round, optimizing both model accuracy and training efficiency. Our evaluations on diverse datasets demonstrate that FedPROM outperforms several state-of-the-art FL client selection protocols in terms of convergence speed, and accuracy, highlighting the framework's effectiveness and the importance of multi-criteria client selection in FL.\",\"PeriodicalId\":516827,\"journal\":{\"name\":\"Proceedings of the AAAI Symposium Series\",\"volume\":\"30 10\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-05-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the AAAI Symposium Series\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1609/aaaiss.v3i1.31227\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the AAAI Symposium Series","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1609/aaaiss.v3i1.31227","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Federated Learning (FL) has received tremendous attention as a decentralized machine learning (ML) framework that allows distributed data owners to collaboratively train a global model without sharing raw data. Since FL trains the model directly on edge devices, the heterogeneity of participating clients in terms of data distribution, hardware capabilities, and network connectivity can significantly impact the overall performance of FL systems. Optimizing for model accuracy could extend the training time due to the diverse and resource-constrained nature of edge devices, while minimizing training time could compromise the model's accuracy. Effective client selection thus becomes crucial to ensure that the training process is not only efficient but also capitalizes on the diverse data and computational capabilities of different devices. To this end, we propose FedPROM, a novel framework that tackles client selection in FL as a multi-criteria optimization problem. By leveraging the PROMETHEE method, FedPROM ranks clients based on their suitability for a given FL task, considering multiple criteria such as system resources, network conditions, and data quality. This approach allows FedPROM to dynamically select the most appropriate set of clients for each learning round, optimizing both model accuracy and training efficiency. Our evaluations on diverse datasets demonstrate that FedPROM outperforms several state-of-the-art FL client selection protocols in terms of convergence speed and accuracy, highlighting the framework's effectiveness and the importance of multi-criteria client selection in FL.
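
The abstract does not detail FedPROM's exact scoring pipeline, so the sketch below only illustrates the general PROMETHEE II ranking step it builds on: clients are scored on several criteria, pairwise preferences are aggregated with criterion weights, and net outranking flows give the ranking. The criterion names, weights, preference function, and numeric values here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def promethee_rank(scores, weights, maximize):
    """Rank alternatives (clients) with PROMETHEE II net outranking flows.

    scores:   (n_clients, n_criteria) matrix of criterion values
    weights:  (n_criteria,) non-negative weights summing to 1
    maximize: (n_criteria,) bools; True if higher is better for that criterion
    """
    n, _ = scores.shape
    # Orient every criterion so that larger values are always preferred.
    oriented = np.where(maximize, scores, -scores)

    # Pairwise differences d_j(a, b) = f_j(a) - f_j(b) for each criterion.
    diffs = oriented[:, None, :] - oriented[None, :, :]   # shape (n, n, m)

    # "Usual" preference function: full preference whenever d > 0.
    prefs = (diffs > 0).astype(float)                     # shape (n, n, m)

    # Weighted aggregated preference index pi(a, b).
    pi = prefs @ weights                                  # shape (n, n)

    # Positive, negative, and net outranking flows.
    phi_plus = pi.sum(axis=1) / (n - 1)
    phi_minus = pi.sum(axis=0) / (n - 1)
    net_flow = phi_plus - phi_minus

    # Higher net flow = more suitable client for the round.
    return np.argsort(-net_flow), net_flow


# Toy example (made-up values): 4 clients scored on compute capacity,
# bandwidth, data quality (higher is better) and latency (lower is better).
scores = np.array([
    [0.9, 0.7, 0.8, 120.0],
    [0.5, 0.9, 0.6,  80.0],
    [0.7, 0.4, 0.9, 200.0],
    [0.3, 0.6, 0.5,  60.0],
])
weights = np.array([0.3, 0.2, 0.4, 0.1])
maximize = np.array([True, True, True, False])

ranking, flows = promethee_rank(scores, weights, maximize)
print("Client ranking (best first):", ranking)
```

In a per-round selection loop of the kind the abstract describes, a server could take the top-k clients from such a ranking before dispatching the global model; how FedPROM weights the criteria and combines this with training feedback is specified in the paper itself.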