Edge-Based Graph Component Pooling
T. Snelleman, B. M. Renting, H. H. Hoos, J. N. van Rijn
arXiv - CS - Machine Learning (arXiv:2409.11856), published 2024-09-18
https://doi.org/arxiv-2409.11856
Graph-structured data occurs naturally in many research fields, such as chemistry and sociology. The relational information contained therein can be leveraged to statistically model graph properties through geometric deep learning. Graph neural networks employ techniques such as message-passing layers to propagate local features through a graph. However, message-passing layers can be computationally expensive when dealing with large and sparse graphs.
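For intuition, here is a minimal sketch of a single message-passing layer in plain PyTorch. The class name and tensor conventions (`x`, `edge_index`) are common GNN idioms, not this paper's code:

```python
# Minimal sketch of one message-passing layer (sum aggregation), assuming
# standard conventions: x holds node features, edge_index holds edges.
import torch
import torch.nn as nn

class MessagePassingLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, in_dim]; edge_index: [2, num_edges] as (source, target)
        src, dst = edge_index
        messages = x[src]                     # gather features from source nodes
        agg = torch.zeros_like(x)             # sum-aggregate messages per target node
        agg = agg.index_add(0, dst, messages)
        return torch.relu(self.lin(agg + x))  # combine with self features, transform
```

Each such layer gathers and scatters once per edge, so stacking layers over large graphs multiplies that cost, which is what pooling aims to reduce.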
Graph pooling operators offer the possibility of removing or merging nodes in such graphs, thus lowering computational costs. However, pooling operators that remove nodes cause data loss, and pooling operators that merge nodes are often computationally expensive. We propose a pooling operator that merges nodes, so that no data is lost, yet remains conceptually simple and computationally inexpensive.
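To illustrate merge-based pooling without data loss, below is a generic sketch that contracts selected edges and sums the endpoint features into merged nodes. The edge-selection input (`merge_mask`) and function name are assumptions for illustration, not the authors' exact operator:

```python
# Generic sketch of pooling by edge contraction: each selected edge merges
# its two endpoints into one node; features are summed so nothing is
# discarded. How edges are selected is left abstract here (an assumption).
import torch

def contract_edges(x: torch.Tensor, edge_index: torch.Tensor, merge_mask: torch.Tensor):
    # x: [N, F] node features; edge_index: [2, E]; merge_mask: [E] bool
    cluster = torch.arange(x.size(0))
    for s, d in edge_index[:, merge_mask].t().tolist():
        cluster[cluster == cluster[d]] = cluster[s]   # merge d's cluster into s's
    new_ids, cluster = torch.unique(cluster, return_inverse=True)
    pooled = torch.zeros(new_ids.size(0), x.size(1), dtype=x.dtype)
    pooled = pooled.index_add(0, cluster, x)          # sum features within clusters
    # remap edges to cluster ids and drop self-loops created by contraction
    new_edges = cluster[edge_index]
    keep = new_edges[0] != new_edges[1]
    return pooled, new_edges[:, keep]
```

Because merged features are summed rather than dropped, the coarsened graph retains all node information while having fewer nodes and edges to process in subsequent layers.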
We empirically demonstrate that the proposed pooling operator performs statistically significantly better than EdgePool on four popular benchmark datasets, while reducing time complexity and the number of trainable parameters by 70.6% on average. Compared to the Graph Isomorphism Network, another maximally powerful method, we outperform it on two popular benchmark datasets while reducing the number of learnable parameters by 60.9% on average.