Sequential Monte Carlo optimization and statistical inference

IF 4.4 · CAS Region 2 (Mathematics) · JCR Q1 (Statistics & Probability) · Wiley Interdisciplinary Reviews: Computational Statistics · Publication date: 2022-09-20 · DOI: 10.1002/wics.1598
J. Duan, Shuping Li, Yaxian Xu
{"title":"Sequential Monte Carlo optimization and statistical inference","authors":"J. Duan, Shuping Li, Yaxian Xu","doi":"10.1002/wics.1598","DOIUrl":null,"url":null,"abstract":"Sequential Monte Carlo (SMC) is a powerful technique originally developed for particle filtering and Bayesian inference. As a generic optimizer for statistical and nonstatistical objectives, its role is far less known. Density‐tempered SMC is a highly efficient sampling technique ideally suited for challenging global optimization problems and is implementable with a somewhat arbitrary initialization sampler instead of relying on a prior distribution. SMC optimization is anchored at the fact that all optimization tasks (continuous, discontinuous, combinatorial, or noisy objective function) can be turned into sampling under a density or probability function short of a norming constant. The point with the highest functional value is the SMC estimate for the maximum. Through examples, we systematically present various density‐tempered SMC algorithms and their superior performance vs. other techniques like Markov Chain Monte Carlo. Data cloning and k‐fold duplication are two easily implementable accuracy accelerators, and their complementarity is discussed. The Extreme Value Theorem on the maximum order statistic can also help assess the quality of the SMC optimum. Our coverage includes the algorithmic essence of the density‐tempered SMC with various enhancements and solutions for (1) a bi‐modal nonstatistical function without and with constraints, (2) a multidimensional step function, (3) offline and online optimizations, (4) combinatorial variable selection, and (5) noninvertibility of the Hessian.","PeriodicalId":47779,"journal":{"name":"Wiley Interdisciplinary Reviews-Computational Statistics","volume":null,"pages":null},"PeriodicalIF":4.4000,"publicationDate":"2022-09-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Wiley Interdisciplinary Reviews-Computational Statistics","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1002/wics.1598","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Citations: 0

Abstract

Sequential Monte Carlo (SMC) is a powerful technique originally developed for particle filtering and Bayesian inference. Its role as a generic optimizer for statistical and nonstatistical objectives is far less known. Density-tempered SMC is a highly efficient sampling technique ideally suited for challenging global optimization problems, and it can be implemented with a somewhat arbitrary initialization sampler instead of relying on a prior distribution. SMC optimization is anchored in the fact that any optimization task (continuous, discontinuous, combinatorial, or with a noisy objective function) can be turned into sampling under a density or probability function known up to a normalizing constant. The point with the highest functional value is the SMC estimate of the maximum. Through examples, we systematically present various density-tempered SMC algorithms and their superior performance versus other techniques such as Markov chain Monte Carlo. Data cloning and k-fold duplication are two easily implementable accuracy accelerators, and their complementarity is discussed. The extreme value theorem on the maximum order statistic can also help assess the quality of the SMC optimum. Our coverage includes the algorithmic essence of density-tempered SMC with various enhancements, along with solutions for (1) a bimodal nonstatistical function without and with constraints, (2) a multidimensional step function, (3) offline and online optimization, (4) combinatorial variable selection, and (5) noninvertibility of the Hessian.
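
To make the idea in the abstract concrete, below is a minimal sketch of a density-tempered SMC optimizer in Python. It is not the authors' implementation: the objective `bimodal`, the linear tempering schedule, the uniform initialization sampler, and the single random-walk Metropolis move per stage are all illustrative assumptions. The stage-t target is proportional to exp(gamma_t * f(x)), and the particle with the highest objective value is reported as the estimate of the maximizer, as the abstract describes.

```python
import numpy as np

def bimodal(x):
    """Illustrative bimodal objective (not from the paper); higher mode near x = -2."""
    return np.exp(-0.5 * (x - 2.0) ** 2) + 1.5 * np.exp(-0.5 * (x + 2.0) ** 2)

def smc_optimize(f, n_particles=2000, gammas=np.linspace(0.0, 50.0, 51),
                 init_low=-10.0, init_high=10.0, rw_scale=0.5, seed=0):
    """Density-tempered SMC maximizer: stage-t target proportional to exp(gamma_t * f(x))."""
    rng = np.random.default_rng(seed)
    # Arbitrary initialization sampler (no prior needed): uniform over a wide box.
    x = rng.uniform(init_low, init_high, size=n_particles)
    logw = np.zeros(n_particles)
    for g_prev, g in zip(gammas[:-1], gammas[1:]):
        # Reweight: incremental log-weight (gamma_t - gamma_{t-1}) * f(x).
        logw += (g - g_prev) * f(x)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        # Resample (multinomial) when the effective sample size degenerates.
        ess = 1.0 / np.sum(w ** 2)
        if ess < 0.5 * n_particles:
            idx = rng.choice(n_particles, size=n_particles, p=w)
            x, logw = x[idx], np.zeros(n_particles)
        # Move: one random-walk Metropolis step with target proportional to exp(g * f(x)).
        prop = x + rw_scale * rng.standard_normal(n_particles)
        accept = np.log(rng.uniform(size=n_particles)) < g * (f(prop) - f(x))
        x = np.where(accept, prop, x)
    # The particle with the highest objective value is the SMC estimate of the maximizer.
    return x[np.argmax(f(x))]

x_star = smc_optimize(bimodal)
print("estimated maximizer:", x_star)  # concentrates near x = -2, the higher mode
```

Under these assumptions the tempering schedule gradually sharpens the sampling density around the global mode, so the highest-value particle lands near x = -2 rather than the lower local mode at x = 2; the accuracy accelerators mentioned in the abstract (data cloning, k-fold duplication) would be layered on top of this basic loop.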
Source journal metrics: CiteScore 6.20 · Self-citation rate 0.00% · Articles published: 31