{"title":"Variational Potential Flow: A Novel Probabilistic Framework for Energy-Based Generative Modelling","authors":"Junn Yong Loo, Michelle Adeline, Arghya Pal, Vishnu Monn Baskaran, Chee-Ming Ting, Raphael C. -W. Phan","doi":"arxiv-2407.15238","DOIUrl":null,"url":null,"abstract":"Energy based models (EBMs) are appealing for their generality and simplicity\nin data likelihood modeling, but have conventionally been difficult to train\ndue to the unstable and time-consuming implicit MCMC sampling during\ncontrastive divergence training. In this paper, we present a novel energy-based\ngenerative framework, Variational Potential Flow (VAPO), that entirely\ndispenses with implicit MCMC sampling and does not rely on complementary latent\nmodels or cooperative training. The VAPO framework aims to learn a potential\nenergy function whose gradient (flow) guides the prior samples, so that their\ndensity evolution closely follows an approximate data likelihood homotopy. An\nenergy loss function is then formulated to minimize the Kullback-Leibler\ndivergence between density evolution of the flow-driven prior and the data\nlikelihood homotopy. Images can be generated after training the potential\nenergy, by initializing the samples from Gaussian prior and solving the ODE\ngoverning the potential flow on a fixed time interval using generic ODE\nsolvers. Experiment results show that the proposed VAPO framework is capable of\ngenerating realistic images on various image datasets. 
In particular, our\nproposed framework achieves competitive FID scores for unconditional image\ngeneration on the CIFAR-10 and CelebA datasets.","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":"65 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2407.15238","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Energy-based models (EBMs) are appealing for their generality and simplicity in data likelihood modeling, but they have conventionally been difficult to train due to the unstable and time-consuming implicit MCMC sampling required during contrastive divergence training. In this paper, we present a novel energy-based generative framework, Variational Potential Flow (VAPO), that entirely dispenses with implicit MCMC sampling and does not rely on complementary latent models or cooperative training. The VAPO framework learns a potential energy function whose gradient (flow) guides the prior samples so that their density evolution closely follows an approximate data likelihood homotopy. An energy loss function is then formulated to minimize the Kullback-Leibler divergence between the density evolution of the flow-driven prior and the data likelihood homotopy. After training the potential energy, images can be generated by initializing samples from a Gaussian prior and solving the ODE governing the potential flow over a fixed time interval with generic ODE solvers. Experimental results show that the proposed VAPO framework generates realistic images on various image datasets. In particular, it achieves competitive FID scores for unconditional image generation on the CIFAR-10 and CelebA datasets.
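The sampling procedure described in the abstract (draw from a Gaussian prior, then integrate the ODE defined by the learned potential's gradient over a fixed time interval) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `potential_grad` is a hypothetical stand-in for the gradient of the trained potential energy network, the flow sign convention and the forward-Euler integrator are assumptions, and a generic ODE solver could be substituted for the Euler loop.

```python
import numpy as np


def potential_grad(x):
    # Hypothetical stand-in for the learned potential's gradient grad Phi(x).
    # Here Phi(x) = ||x||^2 / 2 (a quadratic bowl), so the flow pulls
    # samples toward the origin; in VAPO this would be a trained network.
    return x


def generate(n_samples, dim, n_steps=100, t_end=1.0, seed=None):
    """Integrate dx/dt = -grad Phi(x) from Gaussian prior samples.

    Uses forward Euler on the fixed interval [0, t_end]; any generic
    ODE solver could replace this loop.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_samples, dim))  # x(0) ~ N(0, I) prior
    dt = t_end / n_steps
    for _ in range(n_steps):
        x = x - dt * potential_grad(x)  # Euler step along the potential flow
    return x


samples = generate(8, 4, seed=0)
```

With the toy quadratic potential, each Euler step scales the samples by (1 - dt), so the final samples equal the prior draws shrunk by (1 - dt)^n_steps; with a learned potential, the same loop would instead transport prior noise toward the data distribution.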