MCMC stopping rules in latent variable modelling
Sunbeom Kwon, Susu Zhang, Hans Friedrich Köhn, Bo Zhang
British Journal of Mathematical & Statistical Psychology. Published 2024-10-10. DOI: 10.1111/bmsp.12357
Bayesian analysis relies heavily on the Markov chain Monte Carlo (MCMC) algorithm to obtain random samples from posterior distributions. In this study, we compare the performance of MCMC stopping rules and provide a guideline for determining the termination point of the MCMC algorithm in latent variable models. In simulation studies, we examine the performance of four different MCMC stopping rules: potential scale reduction factor (PSRF), fixed-width stopping rule, Geweke's diagnostic, and effective sample size. Specifically, we evaluate these stopping rules in the context of the DINA model and the bifactor item response theory model, two commonly used latent variable models in educational and psychological measurement. Our simulation study findings suggest that single-chain approaches outperform multiple-chain approaches in terms of item parameter accuracy. However, when it comes to person parameter estimates, the effect of stopping rules diminishes. We caution against relying solely on the univariate PSRF, which is the most popular method, as it may terminate the algorithm prematurely and produce biased item parameter estimates if the cut-off value is not chosen carefully. Our research offers guidance to practitioners on choosing suitable stopping rules to improve the precision of the MCMC algorithm in models involving latent variables.
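The univariate PSRF that the abstract cautions about is the Gelman-Rubin diagnostic: it compares between-chain and within-chain variance across parallel chains and stops the sampler once the ratio is close to 1. The sketch below is a minimal stdlib-only illustration of that computation, not the authors' implementation; the chain data and the 1.1 cut-off mentioned in the comment are common conventions, not values taken from the paper.

```python
import random
import statistics as st

def psrf(chains):
    """Univariate potential scale reduction factor (Gelman-Rubin R-hat).

    chains: list of m lists, each holding n post-burn-in draws of one
    scalar parameter. A common heuristic terminates MCMC once PSRF
    drops below a cut-off such as 1.1 -- the choice the paper warns
    can bias item parameter estimates if made carelessly.
    """
    n = len(chains[0])
    chain_means = [st.fmean(c) for c in chains]
    W = st.fmean(st.variance(c) for c in chains)   # within-chain variance
    B = n * st.variance(chain_means)               # between-chain variance (scaled)
    var_hat = (n - 1) / n * W + B / n              # pooled variance estimate
    return (var_hat / W) ** 0.5

# Two well-mixed chains targeting the same distribution: PSRF near 1.
random.seed(1)
good = [[random.gauss(0, 1) for _ in range(5000)] for _ in range(2)]
print(round(psrf(good), 3))  # close to 1.0
```

Chains stuck around different modes inflate the between-chain term B, pushing PSRF well above 1, which is why a loose cut-off can declare convergence prematurely.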
Journal description:
The British Journal of Mathematical and Statistical Psychology publishes articles whose arguments have a stronger mathematical or statistical component than is usually found in other psychology journals, including:
• mathematical psychology
• statistics
• psychometrics
• decision making
• psychophysics
• classification
• relevant areas of mathematics, computing and computer software
These include articles that address substantive psychological issues or that develop and extend techniques useful to psychologists. New models for psychological processes, new approaches to existing data, critiques of existing models, and improved algorithms for estimating the parameters of a model are examples of articles that may be favoured.