Sub-graph Based Diffusion Model for Link Prediction

Hang Li, Wei Jin, Geri Skenderi, Harry Shomer, Wenzhuo Tang, Wenqi Fan, Jiliang Tang

arXiv:2409.08487 · arXiv - STAT - Machine Learning · Published 2024-09-13
Denoising Diffusion Probabilistic Models (DDPMs) are a modern class of generative models with strong performance in both sample synthesis and data-likelihood maximization. They operate by traversing a forward Markov chain that progressively perturbs the data with noise, followed by a reverse process in which a neural network learns to undo the perturbations and recover the original data.
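For context, the standard DDPM transition kernels take the following form (this is the textbook formulation of the forward and reverse processes, not notation taken from this paper):

\[
q(x_t \mid x_{t-1}) = \mathcal{N}\!\left(x_t;\ \sqrt{1-\beta_t}\,x_{t-1},\ \beta_t I\right), \qquad
p_\theta(x_{t-1} \mid x_t) = \mathcal{N}\!\left(x_{t-1};\ \mu_\theta(x_t, t),\ \Sigma_\theta(x_t, t)\right),
\]

where \beta_t is the noise schedule of the forward chain and \mu_\theta, \Sigma_\theta are produced by the denoising network.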
There has been growing interest in applying DDPMs to the graph domain, but most existing work focuses on graph generation.

In this paper, we build a novel generative model for link prediction. In particular, we treat link prediction for a pair of nodes as conditional likelihood estimation of their enclosing sub-graph. Through a dedicated design that decomposes the likelihood via Bayes' rule, we separate the estimation of the sub-graph structure from the estimation of its node features.
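A minimal sketch of such a decomposition, in our own notation (the paper's exact factorization may differ): writing the enclosing sub-graph as G = (A, X), with structure A and node features X, the chain rule gives

\[
p(G) = p(X, A) = p(X \mid A)\, p(A),
\]

so the structure term p(A) and the feature term p(X \mid A) can be modeled and estimated separately, and a candidate link is scored by the likelihood of its enclosing sub-graph.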
This design allows our model to simultaneously enjoy the benefits of inductive learning and strong generalization. Comprehensive experiments across diverse datasets validate that the proposed method offers several advantages: (1) transferability across datasets without retraining, (2) promising generalization with limited training data, and (3) robustness against graph adversarial attacks.