FedART: A neural model integrating federated learning and adaptive resonance theory

IF 6.0 | Zone 1, Computer Science | Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Neural Networks | Pub Date: 2024-11-04 | DOI: 10.1016/j.neunet.2024.106845
Shubham Pateria, Budhitama Subagdja, Ah-Hwee Tan
{"title":"FedART: A neural model integrating federated learning and adaptive resonance theory","authors":"Shubham Pateria,&nbsp;Budhitama Subagdja,&nbsp;Ah-Hwee Tan","doi":"10.1016/j.neunet.2024.106845","DOIUrl":null,"url":null,"abstract":"<div><div>Federated Learning (FL) has emerged as a promising paradigm for collaborative model training across distributed clients while preserving data privacy. However, prevailing FL approaches aggregate the clients’ local models into a global model through multi-round iterative parameter averaging. This leads to the undesirable bias of the aggregated model towards certain clients in the presence of heterogeneous data distributions among the clients. Moreover, such approaches are restricted to supervised classification tasks and do not support unsupervised clustering. To address these limitations, we propose a novel one-shot FL approach called Federated Adaptive Resonance Theory (FedART) which leverages self-organizing Adaptive Resonance Theory (ART) models to learn category codes, where each code represents a cluster of similar data samples. In FedART, the clients learn to associate their private data with various local category codes. Under heterogeneity, the local codes across different clients represent heterogeneous data. In turn, a global model takes these local codes as inputs and aggregates them into global category codes, wherein heterogeneous client data is indirectly represented by distinctly encoded global codes, in contrast to the averaging out of parameters in the existing approaches. This enables the learned global model to handle heterogeneous data. In addition, FedART employs a universal learning mechanism to support both federated classification and clustering tasks. Our experiments conducted on various federated classification and clustering tasks show that FedART consistently outperforms state-of-the-art FL methods on data with heterogeneous distribution across clients.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"181 ","pages":"Article 106845"},"PeriodicalIF":6.0000,"publicationDate":"2024-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S089360802400769X","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Federated Learning (FL) has emerged as a promising paradigm for collaborative model training across distributed clients while preserving data privacy. However, prevailing FL approaches aggregate the clients’ local models into a global model through multi-round iterative parameter averaging. This leads to the undesirable bias of the aggregated model towards certain clients in the presence of heterogeneous data distributions among the clients. Moreover, such approaches are restricted to supervised classification tasks and do not support unsupervised clustering. To address these limitations, we propose a novel one-shot FL approach called Federated Adaptive Resonance Theory (FedART) which leverages self-organizing Adaptive Resonance Theory (ART) models to learn category codes, where each code represents a cluster of similar data samples. In FedART, the clients learn to associate their private data with various local category codes. Under heterogeneity, the local codes across different clients represent heterogeneous data. In turn, a global model takes these local codes as inputs and aggregates them into global category codes, wherein heterogeneous client data is indirectly represented by distinctly encoded global codes, in contrast to the averaging out of parameters in the existing approaches. This enables the learned global model to handle heterogeneous data. In addition, FedART employs a universal learning mechanism to support both federated classification and clustering tasks. Our experiments conducted on various federated classification and clustering tasks show that FedART consistently outperforms state-of-the-art FL methods on data with heterogeneous distribution across clients.
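The abstract describes local ART models that learn category codes from private data, and a server-side global ART that aggregates those codes in a single round rather than averaging parameters. The paper's own implementation is not reproduced here; the sketch below is a minimal illustration assuming Fuzzy ART as the local learner, covering only the unsupervised clustering path. All names (`FuzzyART`, `fedart_one_shot`, `rho_local`, `rho_global`) are hypothetical, and the vigilance and learning parameters are chosen arbitrarily.

```python
import numpy as np

class FuzzyART:
    """Minimal Fuzzy ART learner (an illustrative sketch, not the paper's code).
    Each row of `codes` is a category code representing a cluster of similar,
    complement-coded input samples with features in [0, 1]."""

    def __init__(self, input_dim, rho=0.7, alpha=0.001, beta=1.0):
        self.rho = rho      # vigilance: higher -> tighter, more numerous codes
        self.alpha = alpha  # choice parameter
        self.beta = beta    # learning rate (1.0 = fast one-shot learning)
        self.codes = np.empty((0, 2 * input_dim))

    @staticmethod
    def _complement_code(x):
        # Standard ART preprocessing: [x, 1 - x] preserves amplitude information.
        return np.concatenate([x, 1.0 - x])

    def learn(self, x):
        """Present one sample; resonate with a matching code or commit a new one."""
        i = self._complement_code(np.asarray(x, dtype=float))
        choice = [np.minimum(i, w).sum() / (self.alpha + w.sum()) for w in self.codes]
        for j in np.argsort(choice)[::-1]:  # search codes by descending choice value
            w = self.codes[j]
            if np.minimum(i, w).sum() / i.sum() >= self.rho:  # vigilance test
                self.codes[j] = self.beta * np.minimum(i, w) + (1 - self.beta) * w
                return j
        self.codes = np.vstack([self.codes, i])  # no resonance: new category code
        return len(self.codes) - 1


def fedart_one_shot(client_datasets, input_dim, rho_local=0.75, rho_global=0.6):
    """One-shot aggregation in the spirit of the abstract (hypothetical helper):
    clients train local ART models on private data; the server then trains a
    global ART whose inputs are the local category codes, so heterogeneous
    clients map to distinct global codes instead of averaged parameters."""
    local_codebooks = []
    for data in client_datasets:          # each client trains privately
        art = FuzzyART(input_dim, rho=rho_local)
        for x in data:
            art.learn(x)
        local_codebooks.append(art.codes)  # only codes leave the client, not data
    server = FuzzyART(2 * input_dim, rho=rho_global)  # local codes are 2*d-dim
    for codes in local_codebooks:
        for code in codes:
            server.learn(code)             # aggregate into global category codes
    return server


# Toy usage: two clients with heterogeneous data distributions in [0, 1].
rng = np.random.default_rng(0)
clients = [rng.uniform(0.0, 0.3, size=(50, 4)), rng.uniform(0.7, 1.0, size=(50, 4))]
global_model = fedart_one_shot(clients, input_dim=4)
print("global category codes:", len(global_model.codes))
```

In the actual FedART model, the global codes additionally carry label associations to support federated classification; the sketch above only illustrates how one-shot code aggregation sidesteps parameter averaging under client heterogeneity.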
Source Journal
Neural Networks (Engineering & Technology; Computer Science: Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles per year: 425
Review time: 67 days
About the Journal: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.
Latest Articles in This Journal
Multi-source Selective Graph Domain Adaptation Network for cross-subject EEG emotion recognition
Spectral integrated neural networks (SINNs) for solving forward and inverse dynamic problems
Corrigendum to "Hydra: Multi-head Low-rank Adaptation for Parameter Efficient Fine-tuning" [Neural Networks Volume 178, October (2024), 1-11/106414]
MIU-Net: Advanced multi-scale feature extraction and imbalance mitigation for optic disc segmentation
Recovering Permuted Sequential Features for effective Reinforcement Learning