{"title":"A review of challenges and solutions in the design and implementation of deep graph neural networks","authors":"Aafaq Mohi ud din, Shaima Qureshi","doi":"10.1080/1206212X.2022.2133805","DOIUrl":null,"url":null,"abstract":"The study of graph neural networks has revealed that they can unleash new applications in a variety of disciplines using such a basic process that we cannot imagine in the context of other deep learning designs. Many limitations limit their expressiveness, and researchers are working to overcome them to fully exploit the power of graph data. There are a number of publications that explore graph neural networks (GNNs) restrictions and bottlenecks, but the common thread that runs through them all is that they can all be traced back to message passing, which is the key technique we use to train our graph models. We outline the general GNN design pipeline in this study as well as discuss solutions to the over-smoothing problem, categorize the solutions, and identify open challenges for further research. Abbreviations: CGNN: Continuous Graph Neural Networks; CNN: Convolution NeuralNetwork; DeGNN: Decomposition Graph Neural Network; DGN: Directional GraphNetworks; DGN: Differentiable Group Normalization; DL: Deep Learning; EGAI:Enhancing GNNs by a High-quality Aggregation of Beneficial Information; GAT: GraphAttention Network; GCN: Graph Convolutional Network; GDC: Graph Drop Connect; GDR: Group Distance Ratio; GNN: Graph Neural Network; GRAND: GraphRandom Neural Networks; IIG: Instance Information Gain; MAD: Man AverageDistance; PDE-GCN: Partial Differential Equations-GCN; PTDNet: ParameterizedTopological Denoising network; TDGNN: Tree Decomposition Graph NeuralNetwork;","PeriodicalId":39673,"journal":{"name":"International Journal of Computers and Applications","volume":"105 1","pages":"221 - 230"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Computers and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/1206212X.2022.2133805","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Computer Science","Score":null,"Total":0}
Citations: 1
Abstract
The study of graph neural networks has revealed that they can unlock new applications in a variety of disciplines through a process so simple that it is hard to imagine in the context of other deep learning designs. Several limitations restrict their expressiveness, however, and researchers are working to overcome them in order to fully exploit the power of graph data. A number of publications explore the restrictions and bottlenecks of graph neural networks (GNNs), and the common thread running through them all is that these issues can be traced back to message passing, the key technique used to train graph models. In this study we outline the general GNN design pipeline, discuss solutions to the over-smoothing problem, categorize those solutions, and identify open challenges for further research. Abbreviations: CGNN: Continuous Graph Neural Networks; CNN: Convolutional Neural Network; DeGNN: Decomposition Graph Neural Network; DGN: Directional Graph Networks; DGN: Differentiable Group Normalization; DL: Deep Learning; EGAI: Enhancing GNNs by a High-quality Aggregation of Beneficial Information; GAT: Graph Attention Network; GCN: Graph Convolutional Network; GDC: Graph Drop Connect; GDR: Group Distance Ratio; GNN: Graph Neural Network; GRAND: Graph Random Neural Networks; IIG: Instance Information Gain; MAD: Mean Average Distance; PDE-GCN: Partial Differential Equations-GCN; PTDNet: Parameterized Topological Denoising Network; TDGNN: Tree Decomposition Graph Neural Network.
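To make the message-passing mechanism and the over-smoothing problem the abstract refers to concrete, the following is a minimal Python sketch. It is not taken from the paper: the toy graph, the random-walk-normalized mean aggregation rule, and the iteration count are illustrative assumptions, and learned weights and nonlinearities are deliberately omitted to isolate the propagation effect.

    import numpy as np

    # Toy undirected 4-node graph given as an adjacency matrix.
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)

    A_hat = A + np.eye(4)                         # add self-loops
    P = A_hat / A_hat.sum(axis=1, keepdims=True)  # row-normalized propagation matrix

    rng = np.random.default_rng(0)
    X = rng.random((4, 3))                        # random initial node features

    for _ in range(50):                           # 50 rounds of neighborhood averaging
        X = P @ X                                 # one message-passing step

    # After many rounds every row converges to the same vector, so the node
    # embeddings become indistinguishable: this is over-smoothing.
    print(np.round(X, 4))
    print("max spread across nodes:", np.ptp(X, axis=0).max())

Running the sketch shows the spread across node embeddings collapsing toward zero as depth grows, which is why the solutions surveyed in the review (normalization schemes, drop-edge-style regularizers, decomposition methods, and so on) all target the propagation step in some way.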
About the journal:
The International Journal of Computers and Applications (IJCA) is a unique platform for publishing novel ideas, research outcomes, and fundamental advances in all aspects of Computer Science, Computer Engineering, and Computer Applications. It is a peer-reviewed international journal whose vision is to give the academic and industrial communities a platform for presenting original research ideas and applications. In addition to regular research papers within its scope, IJCA welcomes four special types of papers:
(a) Papers whose results can be easily reproduced. For such papers, the authors will be asked to upload "instructions for reproduction", possibly with the source code or stable URLs from which the code can be downloaded.
(b) Papers with negative results. For such papers, the experimental setting and the negative results must be presented in detail, and it must be explained clearly why the negative results matter to the research community. The rationale behind this kind of paper is that it helps researchers choose sound approaches to solve problems and avoid already-explored failed ones.
(c) Detailed reports, case studies, and literature-review articles about innovative software or hardware, new technology, high-impact computer applications, and future developments, with sufficient background and subject coverage.
(d) Special-issue papers focusing on a particular theme of significant importance, or papers selected from a relevant conference with sufficient improvement and new material to differentiate them from the versions published in the conference proceedings.