{"title":"利用归一化流量了解后投影效应","authors":"Marco Raveri, Cyrille Doux, Shivam Pandey","doi":"arxiv-2409.09101","DOIUrl":null,"url":null,"abstract":"Many modern applications of Bayesian inference, such as in cosmology, are\nbased on complicated forward models with high-dimensional parameter spaces.\nThis considerably limits the sampling of posterior distributions conditioned on\nobserved data. In turn, this reduces the interpretability of posteriors to\ntheir one- and two-dimensional marginal distributions, when more information is\navailable in the full dimensional distributions. We show how to learn smooth\nand differentiable representations of posterior distributions from their\nsamples using normalizing flows, which we train with an added evidence error\nloss term, to improve accuracy in multiple ways. Motivated by problems from\ncosmology, we implement a robust method to obtain one and two-dimensional\nposterior profiles. These are obtained by optimizing, instead of integrating,\nover other parameters, and are thus less prone than marginals to so-called\nprojection effects. We also demonstrate how this representation provides an\naccurate estimator of the Bayesian evidence, with log error at the 0.2 level,\nallowing accurate model comparison. We test our method on multi-modal mixtures\nof Gaussians up to dimension 32 before applying it to simulated cosmology\nexamples. 
Our code is publicly available at\nhttps://github.com/mraveri/tensiometer.","PeriodicalId":501163,"journal":{"name":"arXiv - PHYS - Instrumentation and Methods for Astrophysics","volume":"4 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Understanding posterior projection effects with normalizing flows\",\"authors\":\"Marco Raveri, Cyrille Doux, Shivam Pandey\",\"doi\":\"arxiv-2409.09101\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Many modern applications of Bayesian inference, such as in cosmology, are\\nbased on complicated forward models with high-dimensional parameter spaces.\\nThis considerably limits the sampling of posterior distributions conditioned on\\nobserved data. In turn, this reduces the interpretability of posteriors to\\ntheir one- and two-dimensional marginal distributions, when more information is\\navailable in the full dimensional distributions. We show how to learn smooth\\nand differentiable representations of posterior distributions from their\\nsamples using normalizing flows, which we train with an added evidence error\\nloss term, to improve accuracy in multiple ways. Motivated by problems from\\ncosmology, we implement a robust method to obtain one and two-dimensional\\nposterior profiles. These are obtained by optimizing, instead of integrating,\\nover other parameters, and are thus less prone than marginals to so-called\\nprojection effects. We also demonstrate how this representation provides an\\naccurate estimator of the Bayesian evidence, with log error at the 0.2 level,\\nallowing accurate model comparison. We test our method on multi-modal mixtures\\nof Gaussians up to dimension 32 before applying it to simulated cosmology\\nexamples. 
Our code is publicly available at\\nhttps://github.com/mraveri/tensiometer.\",\"PeriodicalId\":501163,\"journal\":{\"name\":\"arXiv - PHYS - Instrumentation and Methods for Astrophysics\",\"volume\":\"4 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - PHYS - Instrumentation and Methods for Astrophysics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.09101\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Instrumentation and Methods for Astrophysics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.09101","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Understanding posterior projection effects with normalizing flows
Many modern applications of Bayesian inference, such as in cosmology, are
based on complicated forward models with high-dimensional parameter spaces.
This considerably limits the sampling of posterior distributions conditioned on
observed data. In turn, this restricts the interpretation of posteriors to
their one- and two-dimensional marginal distributions, even though more
information is available in the full-dimensional distribution. We show how to learn smooth
and differentiable representations of posterior distributions from their
samples using normalizing flows, which we train with an additional evidence-error
loss term to improve accuracy in multiple ways. Motivated by problems from
cosmology, we implement a robust method to obtain one- and two-dimensional
posterior profiles. These are obtained by optimizing, instead of integrating,
over other parameters, and are thus less prone than marginals to so-called
projection effects. We also demonstrate how this representation provides an
accurate estimator of the Bayesian evidence, with errors on the log-evidence
at the 0.2 level, enabling reliable model comparison. We test our method on multi-modal mixtures
of Gaussians up to dimension 32 before applying it to simulated cosmology
examples. Our code is publicly available at
https://github.com/mraveri/tensiometer.
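The contrast between profiles and marginals described in the abstract can be illustrated on a toy problem. The sketch below is not the paper's implementation; it uses a hypothetical 2D posterior whose conditional width in y varies with x, so that the 1D marginal (integrating over y) picks up a volume factor that shifts its peak away from the true maximum, while the 1D profile (optimizing over y) stays put.

```python
# Toy demonstration of a projection effect: marginal vs. profile.
# (Illustrative sketch only, not the paper's method or code.)
import numpy as np

x = np.linspace(-3, 3, 601)
y = np.linspace(-20, 20, 801)
X, Y = np.meshgrid(x, y, indexing="ij")

# Unnormalized log-posterior: Gaussian in x, times a Gaussian in y whose
# width grows as exp(0.8 * x) -- the varying width creates the volume effect.
log_post = -0.5 * X**2 - 0.5 * Y**2 * np.exp(-1.6 * X)
post = np.exp(log_post)

marginal = post.sum(axis=1)  # integrate y out (constant grid spacing)
profile = post.max(axis=1)   # optimize over y instead

x_marg = x[np.argmax(marginal)]  # peak shifted to x ~ 0.8 by the volume factor
x_prof = x[np.argmax(profile)]   # peak stays at the true maximum, x ~ 0
print(f"marginal peak: x = {x_marg:.2f}; profile peak: x = {x_prof:.2f}")
```

Analytically, the marginal here is proportional to exp(-x²/2 + 0.8x), peaking at x = 0.8, while the profile is proportional to exp(-x²/2), peaking at x = 0, which is exactly the bias that profiling avoids.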
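The evidence estimate mentioned in the abstract rests on a simple identity: if q(θ) is a learned, normalized density approximating the posterior, then Z = p̃(θ)/q(θ) at any θ, where p̃ = likelihood × prior is the unnormalized posterior. The sketch below is a toy stand-in, not the paper's code: a fitted Gaussian plays the role of the normalizing flow on a 1D conjugate-Gaussian problem where the evidence is known analytically.

```python
# Toy evidence estimate from a learned normalized density.
# (Illustrative sketch; a Gaussian fit stands in for the normalizing flow.)
import numpy as np

rng = np.random.default_rng(0)
d = 1.0  # observed datum; model: d | theta ~ N(theta, 1), prior theta ~ N(0, 1)

# The posterior is N(d/2, 1/2); draw samples from it (stand-in for MCMC).
samples = rng.normal(d / 2, np.sqrt(0.5), size=20000)

def log_p_tilde(t):  # log likelihood + log prior (unnormalized posterior)
    return (-0.5 * (d - t) ** 2 - 0.5 * np.log(2 * np.pi)
            - 0.5 * t ** 2 - 0.5 * np.log(2 * np.pi))

# "Learned" normalized density: a Gaussian fit to the samples.
mu, sig = samples.mean(), samples.std()
def log_q(t):
    return -0.5 * ((t - mu) / sig) ** 2 - np.log(sig) - 0.5 * np.log(2 * np.pi)

# Average the log-ratio over the samples; with an exact q it is constant.
log_Z_est = np.mean(log_p_tilde(samples) - log_q(samples))
log_Z_true = -d ** 2 / 4 - 0.5 * np.log(4 * np.pi)  # analytic: Z = N(d; 0, 2)
print(f"log Z estimate = {log_Z_est:.3f}, analytic = {log_Z_true:.3f}")
```

When the learned density matches the posterior exactly, the log-ratio is the same at every sample; the scatter of the per-sample log-ratios therefore diagnoses how well the density was learned.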