Improving Graph Neural Network with Learnable Permutation Pooling
Yu Jin, J. JáJá
2022 IEEE International Conference on Data Mining Workshops (ICDMW), November 2022
DOI: 10.1109/ICDMW58026.2022.00094 (https://doi.org/10.1109/ICDMW58026.2022.00094)
Abstract
Graph neural networks (GNNs) have achieved great success in a wide range of graph-related applications. Most existing GNN models follow the message-passing neural network (MPNN) paradigm, in which the graph pooling function is a critical component that directly determines model effectiveness. In this paper, we propose PermPool, a new graph pooling function that provably improves GNN expressiveness. The method is based on the insight that the distribution of node permutations, when defined properly, forms a characteristic encoding of a graph. We propose to express graph representations as the expectation, over node permutations, of a general pooling function. We show that the resulting graph representation is invariant to node reordering and has stronger expressive power than MPNN models. In addition, we propose novel permutation modeling and sampling techniques that integrate PermPool into differentiable neural network models. Empirical results show that our method outperforms other pooling methods on benchmark graph classification tasks.
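The core idea of the abstract, that averaging an order-sensitive readout over node permutations yields a permutation-invariant graph representation which an exchangeable Monte Carlo sample can approximate, can be sketched as follows. This is a generic illustration, not the paper's method: the position-weighted readout, the uniform permutation distribution, and the function names (`permpool_exact`, `permpool_sampled`) are all assumptions made here for demonstration; PermPool's learned permutation model and pooling function are described in the paper itself.

```python
import itertools
import numpy as np

def order_sensitive_readout(x_perm):
    # A deliberately order-dependent pooling: position-weighted sum of
    # node features. On its own, this is NOT invariant to node ordering.
    weights = np.arange(1, len(x_perm) + 1, dtype=float)
    return weights @ x_perm  # shape: (feature_dim,)

def permpool_exact(x):
    # Exact expectation of the readout over all node permutations.
    # Averaging over the full symmetric group makes the result
    # invariant to any reordering of the input rows.
    n = len(x)
    outs = [order_sensitive_readout(x[list(p)])
            for p in itertools.permutations(range(n))]
    return np.mean(outs, axis=0)

def permpool_sampled(x, num_samples=2000, seed=0):
    # Monte Carlo approximation: average the readout over uniformly
    # sampled permutations instead of enumerating all n! of them.
    rng = np.random.default_rng(seed)
    total = np.zeros(x.shape[1])
    for _ in range(num_samples):
        total += order_sensitive_readout(x[rng.permutation(len(x))])
    return total / num_samples
```

Reordering the node feature matrix leaves `permpool_exact` unchanged, while `order_sensitive_readout` alone changes, which is the invariance property the abstract claims; the sampled version trades exactness for tractability, mirroring the role of the paper's sampling techniques.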