Localized Schrödinger Bridge Sampler
Georg A. Gottwald, Sebastian Reich
arXiv:2409.07968 (arXiv - STAT - Machine Learning), published 2024-09-12
We consider the generative problem of sampling from an unknown distribution
for which only a sufficiently large number of training samples is available.
In this paper, we build on previous work combining Schrödinger bridges and
Langevin dynamics. A key bottleneck of this approach is the exponential
dependence of the required number of training samples on the dimension, $d$,
of the ambient state space. We propose a localization strategy that exploits
conditional independence of conditional expectation values. Localization thus
replaces a single high-dimensional Schrödinger bridge problem by $d$
low-dimensional Schrödinger bridge problems over the available training
samples. As with the original approach, the localized sampler is stable and
geometrically ergodic. The sampler also extends naturally to conditional
sampling and to Bayesian inference. We demonstrate the performance of the
proposed scheme through experiments on a Gaussian problem of increasing
dimension and on a stochastic subgrid-scale parametrization conditional
sampling problem.
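The two ingredients the abstract names can be loosely sketched in code: an entropically regularized (Sinkhorn) kernel built over the training samples, as used in sample-based Schrödinger bridge approximations, and a localized update that treats each coordinate using only a small window of neighboring coordinates. This is an illustrative toy under stated assumptions, not the authors' exact scheme; the window size, the kernel weights used in the update step, and the noise scale are all assumptions made for the sketch.

```python
import numpy as np

def sinkhorn_kernel(X, eps, iters=100):
    """Sample-based Sinkhorn kernel over training samples X of shape (M, k).

    Returns a row-stochastic Markov matrix obtained by symmetric Sinkhorn
    scaling of the Gaussian kernel exp(-|x_i - x_j|^2 / eps).
    """
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-D / eps)
    u = np.ones(len(X))
    for _ in range(iters):
        v = 1.0 / (K @ u)          # enforce unit column marginals
        u = 1.0 / (K @ v)          # enforce unit row marginals
    P = u[:, None] * K * v[None, :]
    return P / P.sum(axis=1, keepdims=True)  # row-normalize -> Markov kernel

def localized_step(x, X, eps, rng):
    """One localized sampling step (toy version).

    For each coordinate i, the conditional expectation is estimated from the
    training samples X (shape (M, d)) using only the coordinates in a small
    periodic window {i-1, i, i+1} (a hypothetical localization choice), and
    Gaussian noise is added, mimicking a Langevin-type update.
    """
    d = len(x)
    x_new = np.empty(d)
    for i in range(d):
        idx = [j % d for j in (i - 1, i, i + 1)]  # low-dimensional window
        # Kernel weights of the current state against the localized samples
        # (a simple stand-in for the Sinkhorn-bridge weights).
        w = np.exp(-np.sum((X[:, idx] - x[idx]) ** 2, axis=1) / eps)
        w /= w.sum()
        x_new[i] = w @ X[:, i] + np.sqrt(eps / 2.0) * rng.normal()
    return x_new
```

Because each coordinate update only ever touches a 3-dimensional slice of the samples, the number of samples needed to resolve the kernel no longer grows with the ambient dimension $d$, which is the point of the localization argument.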