Convolution Bridge: An Effective Algorithmic Migration Strategy From CNNs to GNNs

Kuijie Zhang; Shanchen Pang; Huahui Yang; Yuanyuan Zhang; Wenhao Wu; Hengxiao Li; Jerry Chun-Wei Lin

IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 6, pp. 10764-10778
Published: 2025-01-15
DOI: 10.1109/TNNLS.2025.3527501
URL: https://ieeexplore.ieee.org/document/10843327/
Citations: 0
Abstract
Graph neural networks (GNNs), as a rising star in machine learning, are widely used for relational data and have achieved outstanding performance on graph tasks. GNNs continuously take inspiration from mature models in other domains, such as computer vision and natural language processing, to motivate the development of graph algorithms. However, because data structures differ across domains, cross-domain model migration typically requires a long process of disassembly and reconstruction, which may not yield the desired results. To preserve the excellent properties of convolution and streamline the migration process from convolutional neural networks (CNNs) to GNNs, we propose a convolution bridge. The convolution bridge realizes data alignment from CNN to GNN, so that a CNN-based model can be efficiently migrated to a graph-structured model. To demonstrate the effectiveness of our migration strategy, we migrated the Inception module and the U-Net architecture from CNNs to GNNs, yielding GraInc and GraU-Net for node-level and graph-level tasks, respectively. Experimental results show that GraInc and GraU-Net are highly competitive with current state-of-the-art models, particularly on dense graph datasets.
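The abstract does not specify how the convolution bridge performs its CNN-to-GNN data alignment, but the underlying idea — that a grid convolution is a special case of neighbourhood aggregation on a graph — can be illustrated with a minimal sketch. The function names (`grid_to_graph`, `graph_conv`) and the offset-labelled edge representation below are illustrative assumptions, not the paper's actual construction: each pixel becomes a node, each 3×3 neighbour relation becomes an edge tagged with its spatial offset, and a graph convolution that reuses per-offset weights reproduces a CNN convolution on the grid.

```python
import numpy as np

def grid_to_graph(h, w):
    """Map an h x w image grid to a graph: one node per pixel, edges to the
    8 neighbours plus a self-loop, each labelled by its spatial offset so a
    graph convolution can reuse CNN-style per-offset weights.
    (Illustrative sketch; not the paper's actual bridge construction.)"""
    offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
    edges = []  # (src_node, dst_node, offset_id)
    for y in range(h):
        for x in range(w):
            dst = y * w + x
            for k, (dy, dx) in enumerate(offsets):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w:
                    edges.append((ny * w + nx, dst, k))
    return edges

def graph_conv(node_feats, edges, weights):
    """Offset-aware graph convolution: each node sums neighbour features
    scaled by the weight assigned to that neighbour's offset label.
    On a grid graph this coincides with a 3x3 CNN convolution."""
    out = np.zeros_like(node_feats)
    for src, dst, k in edges:
        out[dst] += weights[k] * node_feats[src]
    return out
```

With uniform weights this behaves like a 3×3 box filter: a centre node of a 3×3 image aggregates all nine pixel values, while corner nodes aggregate only their four in-bounds neighbours. On a non-grid graph the same aggregation still runs, which is what makes a grid-defined operator transferable to graph-structured data.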
Journal Description:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.