Zheng Zhao, Ziwei Luo, Jens Sjölund, Thomas B. Schön
{"title":"生成式扩散模型中的条件采样","authors":"Zheng Zhao, Ziwei Luo, Jens Sjölund, Thomas B. Schön","doi":"arxiv-2409.09650","DOIUrl":null,"url":null,"abstract":"Generative diffusions are a powerful class of Monte Carlo samplers that\nleverage bridging Markov processes to approximate complex, high-dimensional\ndistributions, such as those found in image processing and language models.\nDespite their success in these domains, an important open challenge remains:\nextending these techniques to sample from conditional distributions, as\nrequired in, for example, Bayesian inverse problems. In this paper, we present\na comprehensive review of existing computational approaches to conditional\nsampling within generative diffusion models. Specifically, we highlight key\nmethodologies that either utilise the joint distribution, or rely on\n(pre-trained) marginal distributions with explicit likelihoods, to construct\nconditional generative samplers.","PeriodicalId":501340,"journal":{"name":"arXiv - STAT - Machine Learning","volume":"6 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Conditional sampling within generative diffusion models\",\"authors\":\"Zheng Zhao, Ziwei Luo, Jens Sjölund, Thomas B. Schön\",\"doi\":\"arxiv-2409.09650\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Generative diffusions are a powerful class of Monte Carlo samplers that\\nleverage bridging Markov processes to approximate complex, high-dimensional\\ndistributions, such as those found in image processing and language models.\\nDespite their success in these domains, an important open challenge remains:\\nextending these techniques to sample from conditional distributions, as\\nrequired in, for example, Bayesian inverse problems. 
In this paper, we present\\na comprehensive review of existing computational approaches to conditional\\nsampling within generative diffusion models. Specifically, we highlight key\\nmethodologies that either utilise the joint distribution, or rely on\\n(pre-trained) marginal distributions with explicit likelihoods, to construct\\nconditional generative samplers.\",\"PeriodicalId\":501340,\"journal\":{\"name\":\"arXiv - STAT - Machine Learning\",\"volume\":\"6 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - STAT - Machine Learning\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.09650\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - STAT - Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.09650","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Conditional sampling within generative diffusion models
Generative diffusions are a powerful class of Monte Carlo samplers that
leverage bridging Markov processes to approximate complex, high-dimensional
distributions, such as those found in image processing and language models.
Despite their success in these domains, an important open challenge remains:
extending these techniques to sample from conditional distributions, as
required in, for example, Bayesian inverse problems. In this paper, we present
a comprehensive review of existing computational approaches to conditional
sampling within generative diffusion models. Specifically, we highlight key
methodologies that either utilise the joint distribution, or rely on
(pre-trained) marginal distributions with explicit likelihoods, to construct
conditional generative samplers.
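The second family of methods mentioned above, which combines a (pre-trained) marginal score with an explicit likelihood, can be illustrated on a toy example. The sketch below is not from the paper: it uses a conjugate Gaussian model, where the prior score is known in closed form standing in for a learned marginal score, and runs unadjusted Langevin dynamics on the resulting posterior score. All function names and parameter values are illustrative choices.

```python
import numpy as np

# Toy conditional sampler: posterior score = prior score + likelihood score,
# i.e. d/dx log p(x | y) = d/dx log p(x) + d/dx log p(y | x).
# Prior: x ~ N(0, 1); observation model: y = x + eps, eps ~ N(0, sigma_y^2).

def prior_score(x):
    # Score of the standard normal prior; in a generative-diffusion setting
    # this role is played by a learned (pre-trained) score network.
    return -x

def likelihood_score(x, y, sigma_y):
    # Gradient of log N(y; x, sigma_y^2) with respect to x.
    return (y - x) / sigma_y**2

def conditional_langevin(y, sigma_y=0.5, steps=5000, step_size=1e-2,
                         n_particles=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)  # initialise from the prior
    for _ in range(steps):
        score = prior_score(x) + likelihood_score(x, y, sigma_y)
        # Unadjusted Langevin step targeting p(x | y).
        x = x + step_size * score + np.sqrt(2 * step_size) * rng.standard_normal(n_particles)
    return x

samples = conditional_langevin(y=1.0)
# Conjugacy gives the exact posterior N(y / (1 + sigma_y^2), sigma_y^2 / (1 + sigma_y^2)),
# i.e. mean 0.8 and variance 0.2 for y = 1.0, sigma_y = 0.5.
print(samples.mean(), samples.var())
```

Because the model is conjugate, the empirical mean and variance of the samples can be checked against the closed-form posterior; in the diffusion setting the same prior-plus-likelihood decomposition of the score is what guidance-style conditional samplers exploit, with the analytic prior score replaced by a network evaluated along the reverse-time process.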