{"title":"A Gibbs Sampling Algorithm with Monotonicity Constraints for Diagnostic Classification Models","authors":"K. Yamaguchi, J. Templin","doi":"10.31234/osf.io/undcv","DOIUrl":null,"url":null,"abstract":"Diagnostic classification models (DCMs) are restricted latent class models with a set of cross-class equality constraints and additional monotonicity constraints on their item parameters, both of which are needed to ensure the meaning of classes and model parameters. In this paper, we develop an efficient, Gibbs sampling-based Bayesian Markov chain Monte Carlo estimation method for general DCMs with monotonicity constraints. A simulation study was conducted to evaluate parameter recovery of the algorithm which showed accurate estimation of model parameters. Moreover, the proposed algorithm was compared to a previously developed Gibbs sampling algorithm which imposed constraints on only the main effect item parameters of the log-linear cognitive diagnosis model. The newly proposed algorithm showed less bias and faster convergence. An analysis of the 2000 Programme for International Student Assessment reading assessment data using this algorithm was also conducted.","PeriodicalId":50241,"journal":{"name":"Journal of Classification","volume":"39 1","pages":"24-54"},"PeriodicalIF":1.8000,"publicationDate":"2020-10-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Classification","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.31234/osf.io/undcv","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 6
Abstract
Diagnostic classification models (DCMs) are restricted latent class models with a set of cross-class equality constraints and additional monotonicity constraints on their item parameters, both of which are needed to ensure the meaning of the classes and model parameters. In this paper, we develop an efficient, Gibbs sampling-based Bayesian Markov chain Monte Carlo estimation method for general DCMs with monotonicity constraints. A simulation study conducted to evaluate parameter recovery showed that the algorithm estimates model parameters accurately. Moreover, the proposed algorithm was compared with a previously developed Gibbs sampling algorithm that imposed constraints only on the main-effect item parameters of the log-linear cognitive diagnosis model; the newly proposed algorithm showed less bias and faster convergence. An analysis of reading assessment data from the 2000 Programme for International Student Assessment using this algorithm was also conducted.
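The abstract describes a Gibbs sampler whose full conditionals respect monotonicity constraints on item parameters. As a rough illustration of that idea (not the authors' general-DCM algorithm), the sketch below performs one Gibbs update of the slip and guessing parameters of a simple DINA model, drawing each from a Beta full conditional truncated so that the correct-response probability for masters, 1 - s_j, stays above the guessing probability g_j. The Beta(a0, b0) priors, variable names, and the DINA simplification are all assumptions made for illustration.

```python
# Minimal sketch: one constrained Gibbs update of DINA item parameters.
# Enforces the monotonicity constraint g_j < 1 - s_j by sampling from
# truncated Beta full conditionals via the inverse CDF. Illustrative only;
# the paper's algorithm handles general DCMs, not just DINA.
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(0)

def truncated_beta(a, b, lo, hi, rng):
    """Draw from Beta(a, b) truncated to (lo, hi) using the inverse CDF."""
    u = rng.uniform(beta.cdf(lo, a, b), beta.cdf(hi, a, b))
    return beta.ppf(u, a, b)

def gibbs_item_step(y_j, eta_j, g_j, a0=1.0, b0=1.0, rng=rng):
    """One Gibbs update of (slip, guess) for item j given complete data.

    y_j   : (N,) binary responses to item j
    eta_j : (N,) ideal responses (1 = examinee has all required attributes)
    g_j   : current guessing parameter, used to truncate the slip draw
    """
    # Sufficient counts from the complete data.
    n1 = eta_j.sum()                   # examinees expected to answer correctly
    s_correct = y_j[eta_j == 1].sum()  # ... who actually did
    n0 = len(y_j) - n1
    g_correct = y_j[eta_j == 0].sum()  # correct answers by guessing

    # s_j | rest ~ Beta(a0 + n1 - s_correct, b0 + s_correct), truncated to s_j < 1 - g_j
    s_j = truncated_beta(a0 + n1 - s_correct, b0 + s_correct, 0.0, 1.0 - g_j, rng)
    # g_j | rest ~ Beta(a0 + g_correct, b0 + n0 - g_correct), truncated to g_j < 1 - s_j
    g_j = truncated_beta(a0 + g_correct, b0 + n0 - g_correct, 0.0, 1.0 - s_j, rng)
    return s_j, g_j
```

In a complete sampler this step would alternate with draws of each examinee's attribute profile and the class membership probabilities; the truncation is what keeps every retained draw inside the monotone region, which is the role the monotonicity constraints play in the method described above.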
Journal Introduction:
The journal publishes original and valuable papers in the fields of classification, numerical taxonomy, multidimensional scaling and other ordination techniques, clustering, tree structures and other network models (with somewhat less emphasis on principal components analysis, factor analysis, and discriminant analysis), as well as associated models and algorithms for fitting them. Articles support advances in methodology while demonstrating compelling substantive applications. Comprehensive review articles are also acceptable. Contributions represent disciplines such as statistics, psychology, biology, information retrieval, anthropology, archeology, astronomy, business, chemistry, computer science, economics, engineering, geography, geology, linguistics, marketing, mathematics, medicine, political science, psychiatry, sociology, and soil science.