{"title":"SC-FGCL: Self-Adaptive Cluster-Based Federal Graph Contrastive Learning","authors":"Tingqi Wang;Xu Zheng;Lei Gao;Tianqi Wan;Ling Tian","doi":"10.1109/OJCS.2023.3235593","DOIUrl":null,"url":null,"abstract":"As a self-supervised learning method, the graph contrastive learning achieve admirable performance in graph pre-training tasks, and can be fine-tuned for multiple downstream tasks such as protein structure prediction, social recommendation, \n<italic>etc.</i>\n One prerequisite for graph contrastive learning is the support of huge graphs in the training procedure. However, the graph data nowadays are distributed in various devices and hold by different owners, like those smart devices in Internet of Things. Considering the non-negligible consumptions on computing, storage, communication, data privacy and other issues, these devices often prefer to keep data locally, which significantly reduces the graph contrastive learning performance. In this paper, we propose a novel federal graph contrastive learning framework. First, it is able to update node embeddings during training by means of a federation method, allowing the local GCL to acquire anchors with richer information. Second, we design a Self-adaptive Cluster-based server strategy to select the optimal embedding update scheme, which maximizes the richness of the embedding information while avoiding the interference of noise. Generally, our method can build anchors with richer information through a federated learning approach, thus alleviating the performance degradation of graph contrastive learning due to distributed storage. Extensive analysis and experimental results demonstrate the superiority of our framework.","PeriodicalId":13205,"journal":{"name":"IEEE Open Journal of the Computer Society","volume":"4 ","pages":"13-22"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/iel7/8782664/10016900/10015148.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Open Journal of the Computer Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10015148/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
As a self-supervised learning method, graph contrastive learning (GCL) achieves admirable performance on graph pre-training tasks and can be fine-tuned for multiple downstream tasks such as protein structure prediction and social recommendation. One prerequisite for graph contrastive learning is the availability of huge graphs during training. However, graph data today are distributed across various devices and held by different owners, such as smart devices in the Internet of Things. Considering the non-negligible costs of computing, storage, and communication, as well as data privacy concerns, these devices often prefer to keep data locally, which significantly degrades graph contrastive learning performance. In this paper, we propose a novel federated graph contrastive learning framework. First, it updates node embeddings during training by means of a federation method, allowing the local GCL to acquire anchors with richer information. Second, we design a self-adaptive cluster-based server strategy to select the optimal embedding update scheme, which maximizes the richness of the embedding information while avoiding interference from noise. In general, our method builds anchors with richer information through a federated learning approach, thus alleviating the performance degradation that graph contrastive learning suffers under distributed storage. Extensive analysis and experimental results demonstrate the superiority of our framework.
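The abstract does not spell out the server-side algorithm, so the following is only a minimal illustrative sketch of what a cluster-based, noise-filtering aggregation of client node embeddings could look like. The function name `aggregate_embeddings`, the choice of k-means, and the "largest cluster wins" rule are all assumptions made for illustration, not the authors' actual SC-FGCL implementation.

```python
# Hypothetical sketch of a cluster-based server aggregation step (not the
# authors' SC-FGCL code). Each client uploads its node-embedding matrix; the
# server clusters the client updates and keeps only the dominant cluster's
# mean, on the assumption that small outlying clusters are noise.
import numpy as np
from sklearn.cluster import KMeans

def aggregate_embeddings(client_embeddings: list[np.ndarray],
                         n_clusters: int = 2) -> np.ndarray:
    """Aggregate per-client node embeddings into one shared anchor embedding.

    client_embeddings: one (num_nodes, dim) matrix per client; each client is
    reduced to a single representative point in embedding space.
    """
    # Represent each client by the mean of its node embeddings.
    points = np.stack([e.mean(axis=0) for e in client_embeddings])  # (n_clients, dim)

    # Cluster the client representatives. Assumption: the largest cluster
    # carries the most reliable signal; smaller clusters are treated as noise.
    k = min(n_clusters, len(points))
    labels = KMeans(n_clusters=k, n_init=10).fit_predict(points)
    dominant = np.bincount(labels).argmax()

    # The aggregated anchor is the mean of the dominant cluster's members.
    return points[labels == dominant].mean(axis=0)

# Usage: five well-behaved clients plus one noisy outlier. The outlier lands
# in its own cluster and is excluded from the aggregated anchor.
rng = np.random.default_rng(0)
clients = [rng.normal(0.0, 0.1, (100, 16)) for _ in range(5)]
clients.append(rng.normal(5.0, 0.1, (100, 16)))  # noisy client
anchor = aggregate_embeddings(clients, n_clusters=2)
print(anchor.shape)  # (16,)
```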