{"title":"GraKerformer: A Transformer With Graph Kernel for Unsupervised Graph Representation Learning","authors":"Lixiang Xu;Haifeng Liu;Xin Yuan;Enhong Chen;Yuanyan Tang","doi":"10.1109/TCYB.2024.3465213","DOIUrl":null,"url":null,"abstract":"While highly influential in deep learning, especially in natural language processing, the Transformer model has not exhibited competitive performance in unsupervised graph representation learning (UGRL). Conventional approaches, which focus on local substructures on the graph, offer simplicity but often fall short in encapsulating comprehensive structural information of the graph. This deficiency leads to suboptimal generalization performance. To address this, we proposed the GraKerformer model, a variant of the standard Transformer architecture, to mitigate the shortfall in structural information representation and enhance the performance in UGRL. By leveraging the shortest-path graph kernel (SPGK) to weight attention scores and combining graph neural networks, the GraKerformer effectively encodes the nuanced structural information of graphs. We conducted evaluations on the benchmark datasets for graph classification to validate the superior performance of our approach.","PeriodicalId":13112,"journal":{"name":"IEEE Transactions on Cybernetics","volume":"54 12","pages":"7320-7332"},"PeriodicalIF":9.4000,"publicationDate":"2024-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Cybernetics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10707645/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Abstract
While highly influential in deep learning, especially in natural language processing, the Transformer model has not exhibited competitive performance in unsupervised graph representation learning (UGRL). Conventional approaches, which focus on local substructures of the graph, offer simplicity but often fall short of encapsulating the graph's comprehensive structural information, and this deficiency leads to suboptimal generalization. To address this, we propose GraKerformer, a variant of the standard Transformer architecture, to mitigate the shortfall in structural-information representation and enhance performance in UGRL. By leveraging the shortest-path graph kernel (SPGK) to weight attention scores and combining this with graph neural networks, GraKerformer effectively encodes the nuanced structural information of graphs. Evaluations on benchmark graph-classification datasets validate the superior performance of our approach.
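The mechanism the abstract describes, kernel values modulating attention scores, can be made concrete with a short sketch. The following is a minimal illustration under assumptions, not the authors' exact formulation: it assumes the shortest-path information enters as an elementwise weight on the pre-softmax attention scores and uses an RBF of shortest-path distance as the kernel value; the names `shortest_path_bias` and `kernel_weighted_attention` are hypothetical, not from the paper.

```python
# Sketch: attention scores weighted by a shortest-path-based kernel.
# ASSUMPTION: the paper's SPGK weighting is approximated here by an RBF
# of pairwise shortest-path distance; this is illustrative only.
import networkx as nx
import torch
import torch.nn.functional as F

def shortest_path_bias(dist: torch.Tensor, gamma: float = 0.5) -> torch.Tensor:
    # Map an (n, n) shortest-path distance matrix to similarities in (0, 1]:
    # nearby node pairs keep weight near 1, distant pairs decay toward 0.
    return torch.exp(-gamma * dist.float() ** 2)

def kernel_weighted_attention(x, w_q, w_k, w_v, dist):
    # x: (n, d) node features; dist: (n, n) pairwise shortest-path distances.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = (q @ k.T) / (k.shape[-1] ** 0.5)   # standard scaled dot-product
    scores = scores * shortest_path_bias(dist)  # structural weighting
    return F.softmax(scores, dim=-1) @ v        # (n, d) structure-aware output

# Toy usage: a 6-node cycle graph with random features.
n, d = 6, 8
g = nx.cycle_graph(n)
dist = torch.tensor(nx.floyd_warshall_numpy(g))  # all-pairs SP distances
x = torch.randn(n, d)
w_q, w_k, w_v = (torch.randn(d, d) for _ in range(3))
out = kernel_weighted_attention(x, w_q, w_k, w_v, dist)
```

Multiplicative weighting of the pre-softmax scores keeps each attention row normalized while damping the score contributions of structurally distant node pairs; per the abstract, the full model additionally combines this Transformer branch with graph neural networks.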
Journal Introduction
The scope of the IEEE Transactions on Cybernetics includes computational approaches to the field of cybernetics. Specifically, the Transactions welcomes papers on communication and control across machines, or between machines, humans, and organizations. The scope includes such areas as computational intelligence, computer vision, neural networks, genetic algorithms, machine learning, fuzzy systems, cognitive systems, decision making, and robotics, to the extent that they contribute to the theme of cybernetics or demonstrate an application of cybernetics principles.