Accelerated Iterated Filtering

IF 0.6 · Q4, Statistics & Probability · Austrian Journal of Statistics · Published: 2023-07-19 · DOI: 10.17713/ajs.v52i4.1503
D. Nguyen
Citations: 0

Abstract

Simulation-based inference has attracted much attention in recent years, as direct computation of the likelihood function in many real-world problems is difficult or even impossible. Iterated filtering (Ionides, Bretó, and King 2006; Ionides, Bhadra, Atchadé, and King 2011) enables maximization of the likelihood function via model perturbations and approximation of the gradient of the log-likelihood through sequential Monte Carlo filtering. By an application of Stein's identity, Doucet, Jacob, and Rubenthaler (2013) developed a second-order approximation of the gradient of the log-likelihood using sequential Monte Carlo smoothing. Based on these gradient approximations, we develop a new algorithm for maximizing the likelihood using the Nesterov accelerated gradient. We adapt the accelerated inexact gradient algorithm (Ghadimi and Lan 2016) to the iterated filtering framework, relaxing the unbiased gradient approximation condition. We devise a perturbation policy for iterated filtering that allows the new algorithm to converge at an optimal rate for both concave and non-concave log-likelihood functions. It is comparable to the recently developed Bayes map iterated filtering approach and outperforms the original iterated filtering approach.
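The core idea the abstract describes, Nesterov-accelerated ascent on the log-likelihood driven by an inexact (Monte Carlo) gradient estimate, can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the function names, step size, momentum value, and the noisy quadratic log-likelihood standing in for a sequential Monte Carlo gradient estimate are all assumptions for demonstration.

```python
import numpy as np

def nesterov_ascent(grad_estimate, theta0, steps=100, lr=0.05, momentum=0.9):
    """Nesterov accelerated gradient ascent with a possibly noisy
    gradient estimate (hypothetical sketch, not the paper's method)."""
    theta = np.asarray(theta0, dtype=float)
    velocity = np.zeros_like(theta)
    for _ in range(steps):
        lookahead = theta + momentum * velocity  # Nesterov look-ahead point
        g = grad_estimate(lookahead)             # inexact gradient at look-ahead
        velocity = momentum * velocity + lr * g
        theta = theta + velocity
    return theta

# Toy concave log-likelihood logL(theta) = -0.5 * ||theta - mu||^2,
# whose exact gradient is (mu - theta); small additive noise stands in
# for the Monte Carlo error of a filtering-based gradient estimate.
rng = np.random.default_rng(0)
mu = np.array([2.0, -1.0])
noisy_grad = lambda th: (mu - th) + 0.01 * rng.standard_normal(2)

theta_hat = nesterov_ascent(noisy_grad, np.zeros(2))
```

Even with a biased or noisy gradient oracle, the momentum term drives the iterates toward the maximizer, which is the property the accelerated inexact gradient framework formalizes.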
Source journal
Austrian Journal of Statistics (Statistics & Probability)
CiteScore: 1.10
Self-citation rate: 0.00%
Articles per year: 30
Review time: 24 weeks
About the journal: The Austrian Journal of Statistics is an open-access journal (without any fees) with a long history, published approximately quarterly by the Austrian Statistical Society. Its general objective is to promote and extend the use of statistical methods in all kinds of theoretical and applied disciplines. The journal is indexed in many databases, such as Scopus (by Elsevier), Web of Science ESCI by Clarivate Analytics (formerly Thomson Reuters), DOAJ, Scimago, and many more. The current estimated impact factor (via Publish or Perish) is 0.775. The journal's ISSN is 1026-597X. Original papers and review articles in English will be published if judged consistent with these general aims. All papers will be refereed. Special topics sections will appear from time to time, each with a specialized area of statistical application, theory, or methodology as its theme. Technical notes or problems for consideration under Shorter Communications are also invited. A special section is reserved for book reviews.
Latest articles from this journal
Kernel-based Estimation of Ageing Intensity Function: Properties and Applications
Analysis of Count Time Series: A Bayesian GARMA(p, q) Approach
On the Type-I Half-logistic Distribution and Related Contributions: A Review
Quality and Sensitivity of Composite Indicators for Sustainable Development
On the P-value for Members of the Cressie-Read Family of Divergence Statistics