{"title":"Estimating marginal likelihoods from the posterior draws through a geometric identity","authors":"Johannes Reichl","doi":"10.1515/mcma-2020-2068","DOIUrl":null,"url":null,"abstract":"Abstract This article develops a new estimator of the marginal likelihood that requires only a sample of the posterior distribution as the input from the analyst. This sample may come from any sampling scheme, such as Gibbs sampling or Metropolis–Hastings sampling. The presented approach can be implemented generically in almost any application of Bayesian modeling and significantly decreases the computational burdens associated with marginal likelihood estimation compared to existing techniques. The functionality of this method is demonstrated in the context of probit and logit regressions, on two mixtures of normals models, and also on a high-dimensional random intercept probit. Simulation results show that the simple approach presented here achieves excellent stability in low-dimensional models, and also clearly outperforms existing methods when the number of coefficients in the model increases.","PeriodicalId":46576,"journal":{"name":"Monte Carlo Methods and Applications","volume":"26 1","pages":"205 - 221"},"PeriodicalIF":0.8000,"publicationDate":"2020-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1515/mcma-2020-2068","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Monte Carlo Methods and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1515/mcma-2020-2068","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Abstract
This article develops a new estimator of the marginal likelihood that requires only a sample of the posterior distribution as input from the analyst. This sample may come from any sampling scheme, such as Gibbs sampling or Metropolis–Hastings sampling. The presented approach can be implemented generically in almost any application of Bayesian modeling and significantly reduces the computational burden of marginal likelihood estimation compared to existing techniques. The method is demonstrated on probit and logit regressions, on two mixture-of-normals models, and on a high-dimensional random-intercept probit model. Simulation results show that the simple approach presented here achieves excellent stability in low-dimensional models and clearly outperforms existing methods as the number of coefficients in the model increases.
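To illustrate the workflow the abstract describes, the sketch below obtains posterior draws from a generic sampler (a random-walk Metropolis–Hastings chain for a simulated logit regression) and then computes a marginal-likelihood estimate from those draws alone. Note that it does not implement the paper's geometric-identity estimator; as a stand-in it uses the classical Newton–Raftery harmonic-mean identity, which shares the same "posterior draws in, evidence estimate out" interface. All data, function names, and tuning constants are illustrative assumptions.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.special import expit
from scipy.stats import norm

rng = np.random.default_rng(0)

# --- Simulated logit-regression data (purely illustrative) -----------------
n, d = 200, 3
X = rng.normal(size=(n, d))
beta_true = np.array([0.5, -1.0, 0.25])
y = rng.binomial(1, expit(X @ beta_true))

def log_likelihood(beta):
    """Bernoulli log-likelihood of the logit model."""
    eta = X @ beta
    return np.sum(y * eta - np.logaddexp(0.0, eta))

def log_prior(beta):
    """Independent N(0, 10^2) priors on the coefficients."""
    return np.sum(norm.logpdf(beta, loc=0.0, scale=10.0))

def log_posterior(beta):
    return log_likelihood(beta) + log_prior(beta)

# --- Random-walk Metropolis-Hastings; any posterior sampler would do -------
def metropolis_hastings(n_draws=20000, step=0.15):
    draws = np.empty((n_draws, d))
    beta = np.zeros(d)
    lp = log_posterior(beta)
    for s in range(n_draws):
        prop = beta + step * rng.normal(size=d)
        lp_prop = log_posterior(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
        draws[s] = beta
    return draws[n_draws // 2:]  # discard the first half as burn-in

# --- Stand-in marginal-likelihood estimate from the posterior draws --------
def harmonic_mean_log_evidence(draws):
    """Newton-Raftery harmonic-mean estimate of log p(y).

    Uses only the posterior draws and the likelihood; shown solely to
    illustrate the interface, not the paper's geometric-identity estimator.
    """
    log_liks = np.array([log_likelihood(b) for b in draws])
    # log p(y) ~= -log( (1/S) * sum_s 1/L_s ), computed stably via logsumexp.
    return -(logsumexp(-log_liks) - np.log(len(log_liks)))

posterior_draws = metropolis_hastings()
print("harmonic-mean log marginal likelihood:",
      harmonic_mean_log_evidence(posterior_draws))
```

The harmonic-mean stand-in is known to be unstable (its variance can be infinite), which is precisely the kind of shortcoming the paper's estimator is reported to address; the sketch only shows where such an estimator plugs into a standard posterior-sampling pipeline.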