Authors: Xiongwen Ke, Yanan Fan, Qingping Zhou
arXiv: 2409.07874
Journal: arXiv - STAT - Methodology
Published: 2024-09-12
Fused $L_{1/2}$ prior for large scale linear inverse problem with Gibbs bouncy particle sampler
In this paper, we study a Bayesian approach for solving large-scale linear
inverse problems arising in various scientific and engineering fields. We
propose a fused $L_{1/2}$ prior with edge-preserving and sparsity-promoting
properties and show that it can be formulated as a Gaussian mixture Markov
random field. Since the density function of this family of priors is neither
log-concave nor Lipschitz, gradient-based Markov chain Monte Carlo methods
cannot be applied to sample the posterior. We therefore present a Gibbs sampler
in which all the conditional posteriors involved have closed-form expressions.
The Gibbs sampler works well for small-scale problems, but it is
computationally intractable for large-scale problems because it requires
sampling from a high-dimensional Gaussian distribution. To reduce the
computational burden, we construct a Gibbs bouncy particle sampler (Gibbs-BPS)
based on a piecewise deterministic Markov process. This new sampler combines
elements of the Gibbs sampler with the bouncy particle sampler, and its
computational complexity is an order of magnitude smaller. We show that the
new sampler converges to the target distribution. With computed tomography
examples, we demonstrate that the proposed method is competitive with existing
popular Bayesian methods and is highly efficient on large-scale problems.
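To illustrate the bouncy particle sampler ingredient of the method, the sketch below simulates a BPS for a standard Gaussian target, a case where the bounce times of the piecewise deterministic Markov process can be drawn exactly (the event rate is linear along each linear trajectory). This is a minimal, hypothetical sketch for intuition only, not the paper's Gibbs-BPS; all function names and parameters here are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bps_standard_gaussian(T=5000.0, dt=0.1, refresh_rate=1.0, dim=2):
    """Bouncy particle sampler for U(x) = ||x||^2 / 2 (standard Gaussian).

    The particle (x, v) moves linearly; it bounces at the events of a
    Poisson process with rate lambda(s) = max(0, <grad U(x + s v), v>),
    which for this target is max(0, a + b s) with a = <x, v>, b = <v, v>,
    so the first event time can be inverted in closed form.
    """
    x = np.zeros(dim)
    v = rng.standard_normal(dim)
    v /= np.linalg.norm(v)
    t, next_save = 0.0, dt
    samples = []
    while t < T:
        a, b = x @ v, v @ v
        u = rng.exponential()
        # Invert the integrated rate: solve int_0^tau max(0, a + b s) ds = u.
        if a >= 0:
            tau_bounce = (-a + np.sqrt(a * a + 2.0 * b * u)) / b
        else:
            tau_bounce = -a / b + np.sqrt(2.0 * u / b)
        # Velocity refreshment keeps the process ergodic.
        tau_refresh = rng.exponential(1.0 / refresh_rate)
        tau = min(tau_bounce, tau_refresh)
        # Record the continuous trajectory at a regular clock.
        while next_save < t + tau:
            samples.append(x + (next_save - t) * v)
            next_save += dt
        x = x + tau * v
        t += tau
        if tau_refresh < tau_bounce:
            v = rng.standard_normal(dim)
            v /= np.linalg.norm(v)
        else:
            grad = x  # gradient of U at the bounce point
            v = v - 2.0 * (v @ grad) / (grad @ grad) * grad
    return np.array(samples)

samples = bps_standard_gaussian()
```

Because the target here is an isotropic Gaussian, no thinning is needed; for general log-concave or piecewise-smooth targets one would instead bound the rate and use Poisson thinning, and the Gibbs-BPS of the paper alternates such PDMP moves with Gibbs updates of the remaining variables.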