The Best Fit Bayesian Hierarchical Generalized Linear Model Selection Using Information Complexity Criteria in the MCMC Approach

IF 1.3 · CAS Region 4 (Mathematics) · JCR Q1 MATHEMATICS · Journal of Mathematics · Pub Date: 2024-02-01 · DOI: 10.1155/2024/1459524
Endris Assen Ebrahim, Mehmet Ali Cengiz, Erol Terzi
Citations: 0

Abstract

Both the frequentist and Bayesian schools of statistics have improved the statistical tools and model choices available for collected data or measurements. Model selection approaches have advanced because of the difficulty of comparing complicated hierarchical models in which linear predictors vary by grouping variable and the number of model parameters is not fixed. Many regression model selection criteria are considered, including the maximum likelihood (ML) point estimate of the parameters and the logarithm of the likelihood of the dataset. This paper applies the information complexity criterion (ICOMP), the Bayesian deviance information criterion (DIC), and the widely applicable information criterion (WAIC), computed with brms, to hierarchical linear models fitted to repeated-measures data, using one simulation and two real-data examples. The Fisher information matrix for the Bayesian hierarchical model, considering fixed and random parameters under maximum a posteriori estimation, is derived. Using Gibbs sampling and hybrid Hamiltonian Monte Carlo approaches, six different models were fitted for three distinct application datasets. The best-fitting candidate models were identified for each application dataset under the two MCMC approaches. In this case, the Bayesian hierarchical (mixed-effects) linear model with random intercepts and random slopes, estimated by the Hamiltonian Monte Carlo method, best fits the two application datasets. Information complexity (ICOMP) is a better indicator of the best-fitting models than DIC and WAIC. In addition, the information complexity criterion showed that hierarchical models estimated with gradient-based Hamiltonian Monte Carlo fit best and show superior convergence relative to the gradient-free Gibbs sampling method.
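The criteria named in the abstract can be sketched numerically. The following is a minimal Python illustration, not the paper's implementation: `c1_complexity` computes Bozdogan's C1 entropic complexity used in ICOMP(IFIM) = -2 log L(θ̂) + 2·C1(F̂⁻¹); `icomp_linear` applies it to an ordinary (non-hierarchical) linear regression as a simplified stand-in for the paper's hierarchical models; and `waic` computes WAIC from a matrix of posterior pointwise log-likelihoods. All function names are illustrative assumptions.

```python
import numpy as np

def c1_complexity(cov):
    # Bozdogan's C1 entropic complexity of a covariance matrix S:
    # C1(S) = (s/2) * log(trace(S)/s) - (1/2) * log|S|, with s = dim(S).
    s = cov.shape[0]
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * s * np.log(np.trace(cov) / s) - 0.5 * logdet

def icomp_linear(X, y):
    # ICOMP(IFIM) = -2 log L(theta_hat) + 2 * C1(inverse Fisher information)
    # for an ordinary linear regression fitted by maximum likelihood.
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n                   # ML estimate of error variance
    loglik = -0.5 * n * (np.log(2.0 * np.pi * sigma2) + 1.0)
    # Inverse Fisher information of (beta, sigma^2) is block-diagonal:
    # sigma^2 (X'X)^{-1} for beta, and 2 sigma^4 / n for sigma^2.
    inv_fisher = np.zeros((p + 1, p + 1))
    inv_fisher[:p, :p] = sigma2 * np.linalg.inv(X.T @ X)
    inv_fisher[p, p] = 2.0 * sigma2 ** 2 / n
    return -2.0 * loglik + 2.0 * c1_complexity(inv_fisher)

def waic(loglik_draws):
    # WAIC from an (S posterior draws x n observations) matrix of pointwise
    # log-likelihood values: -2 * (lppd - p_waic), where p_waic is the sum
    # of the per-observation posterior variances of the log-likelihood.
    lppd = np.sum(np.log(np.mean(np.exp(loglik_draws), axis=0)))
    p_waic = np.sum(np.var(loglik_draws, axis=0, ddof=1))
    return -2.0 * (lppd - p_waic)

# Toy usage: simulate a simple regression and score it with ICOMP.
rng = np.random.default_rng(42)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
y = 1.0 + 2.0 * X[:, 1] + rng.normal(size=200)
score = icomp_linear(X, y)
```

As in the paper's comparisons, lower values of either criterion indicate a better trade-off between fit and complexity; the hierarchical case would replace the closed-form inverse Fisher information with the posterior covariance of the fixed and random parameters.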
Source journal
Journal of Mathematics
CiteScore: 2.50
Self-citation rate: 14.30%
Articles published: 0
Journal description: Journal of Mathematics is a broad scope journal that publishes original research articles as well as review articles on all aspects of both pure and applied mathematics. As well as original research, Journal of Mathematics also publishes focused review articles that assess the state of the art and identify upcoming challenges and promising solutions for the community.