{"title":"Signed graph embedding via multi-order neighborhood feature fusion and contrastive learning.","authors":"Chaobo He, Hao Cheng, Jiaqi Yang, Yong Tang, Quanlong Guan","doi":"10.1016/j.neunet.2024.106897","DOIUrl":null,"url":null,"abstract":"<p><p>Signed graphs have been widely applied to model real-world complex networks with positive and negative links, and signed graph embedding has become a popular topic in the field of signed graph analysis. Although various signed graph embedding methods have been proposed, most of them still suffer from the generality problem. Namely, they cannot simultaneously achieve the satisfactory performance in multiple downstream tasks. In view of this, in this paper we propose a signed embedding method named MOSGCN which exhibits two significant characteristics. Firstly, MOSGCN designs a multi-order neighborhood feature fusion strategy based on the structural balance theory, enabling it to adaptively capture local and global structure features for more informative node representations. Secondly, MOSGCN is trained by using the signed graph contrastive learning framework, which further helps it learn more discriminative and robust node representations, leading to the better generality. We select link sign prediction and community detection as the downstream tasks, and conduct extensive experiments to test the effectiveness of MOSGCN on four benchmark datasets. 
The results illustrate the good generality of MOSGCN and the superiority by comparing to state-of-the-art methods.</p>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"182 ","pages":"106897"},"PeriodicalIF":6.0000,"publicationDate":"2024-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1016/j.neunet.2024.106897","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Signed graphs have been widely applied to model real-world complex networks with positive and negative links, and signed graph embedding has become a popular topic in the field of signed graph analysis. Although various signed graph embedding methods have been proposed, most of them still suffer from the generality problem: they cannot simultaneously achieve satisfactory performance across multiple downstream tasks. In view of this, in this paper we propose a signed graph embedding method named MOSGCN, which exhibits two significant characteristics. Firstly, MOSGCN designs a multi-order neighborhood feature fusion strategy based on structural balance theory, enabling it to adaptively capture local and global structural features for more informative node representations. Secondly, MOSGCN is trained using a signed graph contrastive learning framework, which further helps it learn more discriminative and robust node representations, leading to better generality. We select link sign prediction and community detection as the downstream tasks, and conduct extensive experiments on four benchmark datasets to test the effectiveness of MOSGCN. The results illustrate the good generality of MOSGCN and its superiority over state-of-the-art methods.
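The core idea of multi-order propagation under structural balance theory can be illustrated with a small toy example. Structural balance implies that the sign of a multi-hop path is the product of its edge signs ("the enemy of my enemy is my friend"), so powers of a signed adjacency matrix naturally separate balanced from unbalanced multi-order neighbors. The sketch below is only an illustrative assumption of this principle, not the actual MOSGCN implementation; MOSGCN fuses the orders with learned, adaptive weights, whereas the sketch uses a plain mean.

```python
import numpy as np

# Toy signed graph with 4 nodes: +1 = positive link, -1 = negative link.
# (Hypothetical example; not a dataset from the paper.)
A = np.array([
    [ 0,  1, -1,  0],
    [ 1,  0,  0,  1],
    [-1,  0,  0, -1],
    [ 0,  1, -1,  0],
], dtype=float)

X = np.eye(4)  # one-hot node features, for illustration only

def row_normalize(M):
    # Normalize by absolute degree so repeated propagation stays stable.
    d = np.abs(M).sum(axis=1, keepdims=True)
    d[d == 0] = 1.0
    return M / d

P = row_normalize(A)
orders = 2
H = X.copy()
reps = [H]
for _ in range(orders):
    # Signed propagation: the sign of a k-hop path is the product of its
    # edge signs, so P @ H aggregates balanced (+) vs. unbalanced (-) paths.
    H = P @ H
    reps.append(H)

# Multi-order fusion: a simple mean here; MOSGCN instead learns
# adaptive weights over the different orders.
Z = np.mean(reps, axis=0)
print(Z.shape)  # (4, 4)
```

Note how second-order entries pick up sign products: node 0's two-hop influence on node 3 combines a positive path (via node 1) and a positive path (via node 2, where two negative edges multiply to a positive contribution), consistent with balance theory.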
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.