SC-FGCL: Self-Adaptive Cluster-Based Federal Graph Contrastive Learning

Tingqi Wang;Xu Zheng;Lei Gao;Tianqi Wan;Ling Tian
DOI: 10.1109/OJCS.2023.3235593
Journal: IEEE Open Journal of the Computer Society, Volume 4, Pages 13-22
Published: 2023-01-11 (Journal Article)
Open-access PDF: https://ieeexplore.ieee.org/iel7/8782664/10016900/10015148.pdf
Publisher page: https://ieeexplore.ieee.org/document/10015148/
Citations: 0

Abstract

As a self-supervised learning method, graph contrastive learning achieves admirable performance in graph pre-training tasks and can be fine-tuned for multiple downstream tasks such as protein structure prediction and social recommendation. One prerequisite for graph contrastive learning is the availability of huge graphs during training. However, graph data nowadays are distributed across various devices and held by different owners, such as the smart devices in the Internet of Things. Given the non-negligible costs in computing, storage, communication, and data privacy, these devices often prefer to keep data locally, which significantly reduces graph contrastive learning performance. In this paper, we propose a novel federated graph contrastive learning framework. First, it updates node embeddings during training by means of a federation method, allowing the local GCL to acquire anchors with richer information. Second, we design a self-adaptive cluster-based server strategy that selects the optimal embedding update scheme, maximizing the richness of the embedding information while avoiding interference from noise. In general, our method builds anchors with richer information through a federated learning approach, thus alleviating the performance degradation that graph contrastive learning suffers under distributed storage. Extensive analysis and experimental results demonstrate the superiority of our framework.
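The abstract describes a server that aggregates client-side node embeddings via clustering so that contrastive anchors stay informative while noisy contributions are filtered out. The following is only a minimal illustrative sketch of that general idea (plain k-means over client-submitted embeddings, keeping the majority cluster's centroid as the anchor); it is not the paper's actual SC-FGCL algorithm, and the function name and toy data are invented for illustration.

```python
# Hypothetical sketch: cluster-based aggregation of client embeddings on a
# federated server. NOT the paper's SC-FGCL method - just the general idea of
# selecting a representative anchor while discounting noisy clients.
import numpy as np

def cluster_select(client_embeddings: np.ndarray, k: int = 2, iters: int = 10,
                   seed: int = 0) -> np.ndarray:
    """Run plain k-means on client-submitted embeddings and return the
    centroid of the largest cluster as the aggregated anchor embedding.
    Intuition: most clients roughly agree, so small clusters are treated
    as noise and excluded from the aggregate."""
    rng = np.random.default_rng(seed)
    centers = client_embeddings[rng.choice(len(client_embeddings), k, replace=False)]
    for _ in range(iters):
        # assign each embedding to its nearest center
        dists = np.linalg.norm(client_embeddings[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        # recompute each center from its members (keep old center if empty)
        for c in range(k):
            if (labels == c).any():
                centers[c] = client_embeddings[labels == c].mean(axis=0)
    majority = np.bincount(labels, minlength=k).argmax()
    return centers[majority]

# Toy federation round: 5 clients roughly agree, 1 noisy client diverges.
clients = np.vstack([np.ones((5, 4)) + 0.01 * np.arange(5)[:, None],
                     np.full((1, 4), 50.0)])
anchor = cluster_select(clients)
print(anchor)  # close to [1.02, 1.02, 1.02, 1.02]; the outlier is ignored
```

A naive mean over all clients would be pulled toward the noisy outlier (here to roughly 9.2 per dimension), which is why a clustering step before aggregation can preserve anchor quality.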
Source journal metrics: CiteScore 12.60; self-citation rate 0.00%.