Semi-Supervised Node Classification via Semi-Global Graph Transformer Based on Homogeneity Augmentation

Jin Li, Yisong Huang, Xinlong Chen, Yanglan Fu
{"title":"基于同质性增强的半全局图转换器半监督节点分类","authors":"Jin Li, Yisong Huang, Xinlong Chen, Yanglan Fu","doi":"10.1142/s012962642340008x","DOIUrl":null,"url":null,"abstract":"As a kind of generalization of Transformers in the graph domain, Global Graph Transformers are good at learning distant knowledge by directly doing information interactions on complete graphs, which differs from Local Graph Transformers interacting on the original structures. However, we find that most prior works focus only on graph-level tasks (e.g., graph classification) and few Graph Transformer models can effectively solve node-level tasks, especially semi-supervised node classification, which obviously has important practical significance due to the limitation and expensiveness of these node labels. In order to fill this gap, this paper first summarizes the theoretical advantages of Graph Transformers. And based on some exploring experiments, we give some discussions on the main cause of their poor practical performance in semi-supervised node classifications. Secondly, based on this analysis, we design a three-stage homogeneity augmentation framework and propose a Semi-Global Graph Transformer. Considering both global and local perspectives, the proposed model combines various technologies including self-distillation, pseudo-label filtering, pre-training and fine-tuning, and metric learning. Furthermore, it simultaneously enhances the structure and the optimization, improving its effectiveness, scalability, and generalizability. Finally, extensive experiments on seven public homogeneous and heterophilous graph benchmarks show that the proposed method can achieve competitive or much better results compared to many baseline models including state-of-the-arts.","PeriodicalId":422436,"journal":{"name":"Parallel Process. Lett.","volume":"2011 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-05-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Semi-Supervised Node Classification via Semi-Global Graph Transformer Based on Homogeneity Augmentation\",\"authors\":\"Jin Li, Yisong Huang, Xinlong Chen, Yanglan Fu\",\"doi\":\"10.1142/s012962642340008x\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"As a kind of generalization of Transformers in the graph domain, Global Graph Transformers are good at learning distant knowledge by directly doing information interactions on complete graphs, which differs from Local Graph Transformers interacting on the original structures. However, we find that most prior works focus only on graph-level tasks (e.g., graph classification) and few Graph Transformer models can effectively solve node-level tasks, especially semi-supervised node classification, which obviously has important practical significance due to the limitation and expensiveness of these node labels. In order to fill this gap, this paper first summarizes the theoretical advantages of Graph Transformers. And based on some exploring experiments, we give some discussions on the main cause of their poor practical performance in semi-supervised node classifications. Secondly, based on this analysis, we design a three-stage homogeneity augmentation framework and propose a Semi-Global Graph Transformer. Considering both global and local perspectives, the proposed model combines various technologies including self-distillation, pseudo-label filtering, pre-training and fine-tuning, and metric learning. 
Furthermore, it simultaneously enhances the structure and the optimization, improving its effectiveness, scalability, and generalizability. Finally, extensive experiments on seven public homogeneous and heterophilous graph benchmarks show that the proposed method can achieve competitive or much better results compared to many baseline models including state-of-the-arts.\",\"PeriodicalId\":422436,\"journal\":{\"name\":\"Parallel Process. Lett.\",\"volume\":\"2011 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-05-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Parallel Process. Lett.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1142/s012962642340008x\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Parallel Process. Lett.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s012962642340008x","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

As a kind of generalization of Transformers in the graph domain, Global Graph Transformers are good at learning distant knowledge by directly doing information interactions on complete graphs, which differs from Local Graph Transformers interacting on the original structures. However, we find that most prior works focus only on graph-level tasks (e.g., graph classification) and few Graph Transformer models can effectively solve node-level tasks, especially semi-supervised node classification, which obviously has important practical significance due to the limitation and expensiveness of these node labels. In order to fill this gap, this paper first summarizes the theoretical advantages of Graph Transformers. And based on some exploring experiments, we give some discussions on the main cause of their poor practical performance in semi-supervised node classifications. Secondly, based on this analysis, we design a three-stage homogeneity augmentation framework and propose a Semi-Global Graph Transformer. Considering both global and local perspectives, the proposed model combines various technologies including self-distillation, pseudo-label filtering, pre-training and fine-tuning, and metric learning. Furthermore, it simultaneously enhances the structure and the optimization, improving its effectiveness, scalability, and generalizability. Finally, extensive experiments on seven public homogeneous and heterophilous graph benchmarks show that the proposed method can achieve competitive or much better results compared to many baseline models including state-of-the-arts.
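To make the contrast between the two attention schemes concrete, below is a minimal illustrative sketch, not the authors' implementation: a global graph attention layer lets every pair of nodes interact directly, while a local variant masks attention to the original edge set. Learned query/key/value projections and multi-head attention are omitted for brevity.

```python
# Minimal sketch contrasting global vs. local graph attention (assumption:
# simplified single-head attention without learned projections).
import torch

def graph_attention(x, adj=None):
    """x: (N, d) node features.
    adj: optional (N, N) 0/1 adjacency (with self-loops); if given, attention
    is restricted to the original edges (local variant), otherwise every node
    pair interacts (global variant)."""
    d = x.size(-1)
    scores = x @ x.t() / d ** 0.5                              # (N, N) pairwise logits
    if adj is not None:
        scores = scores.masked_fill(adj == 0, float('-inf'))   # keep only original edges
    weights = torch.softmax(scores, dim=-1)
    return weights @ x                                          # aggregated node representations

# Toy example: 4 nodes, 8-dim features, a path graph 0-1-2-3 with self-loops.
x = torch.randn(4, 8)
adj = torch.tensor([[1, 1, 0, 0],
                    [1, 1, 1, 0],
                    [0, 1, 1, 1],
                    [0, 0, 1, 1]])
global_out = graph_attention(x)       # distant nodes interact directly
local_out = graph_attention(x, adj)   # interaction limited to neighbours
```

In the global case, information can travel between any two nodes in a single layer, which is the "distant knowledge" advantage the abstract attributes to Global Graph Transformers; the local case needs multiple layers to propagate over the same distance.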
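The abstract lists pseudo-label filtering as one component of the homogeneity augmentation framework. The sketch below shows one common way such filtering can be done, confidence thresholding on a teacher model's predictions for unlabeled nodes; the threshold value and the exact criterion used in the paper are not given in the abstract, so both are assumptions here.

```python
# Minimal sketch of confidence-based pseudo-label filtering (assumption: the
# paper's precise filtering rule and threshold are not specified in the abstract).
import torch

def filter_pseudo_labels(logits, labeled_mask, threshold=0.9):
    """logits: (N, C) teacher outputs for all nodes.
    labeled_mask: (N,) bool, True where a ground-truth label already exists.
    Returns indices of unlabeled nodes whose maximum class probability exceeds
    the threshold, together with their pseudo-labels."""
    probs = torch.softmax(logits, dim=-1)
    conf, pred = probs.max(dim=-1)
    keep = (~labeled_mask) & (conf > threshold)   # unlabeled and confident
    return keep.nonzero(as_tuple=True)[0], pred[keep]

# Toy usage: 5 nodes, 3 classes, first two nodes carry ground-truth labels.
logits = torch.randn(5, 3) * 3
labeled_mask = torch.tensor([True, True, False, False, False])
idx, pseudo = filter_pseudo_labels(logits, labeled_mask)
# idx / pseudo could then be used as extra supervision or to rewire edges
# toward same-class (homophilous) neighbours during augmentation.
```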