Fused SDT/IRT Models for Mixed-Format Exams

Educational and Psychological Measurement · IF 2.1 · JCR Q2 (Mathematics, Interdisciplinary Applications) · CAS Tier 3 (Psychology) · Pub Date: 2024-03-28 · DOI: 10.1177/00131644241235333
Lawrence T. DeCarlo
{"title":"混合格式考试的融合 SDT/IRT 模型","authors":"Lawrence T. DeCarlo","doi":"10.1177/00131644241235333","DOIUrl":null,"url":null,"abstract":"A psychological framework for different types of items commonly used with mixed-format exams is proposed. A choice model based on signal detection theory (SDT) is used for multiple-choice (MC) items, whereas an item response theory (IRT) model is used for open-ended (OE) items. The SDT and IRT models are shown to share a common conceptualization in terms of latent states of “know/don’t know” at the examinee level. This in turn suggests a way to join or “fuse” the models—through the probability of knowing. A general model that fuses the SDT choice model, for MC items, with a generalized sequential logit model, for OE items, is introduced. Fitting SDT and IRT models simultaneously allows one to examine possible differences in psychological processes across the different types of items, to examine the effects of covariates in both models simultaneously, to allow for relations among the model parameters, and likely offers potential estimation benefits. The utility of the approach is illustrated with MC and OE items from large-scale international exams.","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":"40 1","pages":""},"PeriodicalIF":2.1000,"publicationDate":"2024-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Fused SDT/IRT Models for Mixed-Format Exams\",\"authors\":\"Lawrence T. DeCarlo\",\"doi\":\"10.1177/00131644241235333\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A psychological framework for different types of items commonly used with mixed-format exams is proposed. A choice model based on signal detection theory (SDT) is used for multiple-choice (MC) items, whereas an item response theory (IRT) model is used for open-ended (OE) items. The SDT and IRT models are shown to share a common conceptualization in terms of latent states of “know/don’t know” at the examinee level. This in turn suggests a way to join or “fuse” the models—through the probability of knowing. A general model that fuses the SDT choice model, for MC items, with a generalized sequential logit model, for OE items, is introduced. Fitting SDT and IRT models simultaneously allows one to examine possible differences in psychological processes across the different types of items, to examine the effects of covariates in both models simultaneously, to allow for relations among the model parameters, and likely offers potential estimation benefits. 
The utility of the approach is illustrated with MC and OE items from large-scale international exams.\",\"PeriodicalId\":11502,\"journal\":{\"name\":\"Educational and Psychological Measurement\",\"volume\":\"40 1\",\"pages\":\"\"},\"PeriodicalIF\":2.1000,\"publicationDate\":\"2024-03-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Educational and Psychological Measurement\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://doi.org/10.1177/00131644241235333\",\"RegionNum\":3,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational and Psychological Measurement","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1177/00131644241235333","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

A psychological framework for different types of items commonly used with mixed-format exams is proposed. A choice model based on signal detection theory (SDT) is used for multiple-choice (MC) items, whereas an item response theory (IRT) model is used for open-ended (OE) items. The SDT and IRT models are shown to share a common conceptualization in terms of latent states of "know/don't know" at the examinee level. This in turn suggests a way to join or "fuse" the models—through the probability of knowing. A general model that fuses the SDT choice model, for MC items, with a generalized sequential logit model, for OE items, is introduced. Fitting SDT and IRT models simultaneously allows one to examine possible differences in psychological processes across the different types of items, to examine the effects of covariates in both models simultaneously, to allow for relations among the model parameters, and likely offers potential estimation benefits. The utility of the approach is illustrated with MC and OE items from large-scale international exams.
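As a rough illustration of the "fuse through the probability of knowing" idea described in the abstract, the sketch below links an SDT choice model for an MC item and a sequential-logit model for an OE item through a single latent know/don't-know probability. The 2PL-style form chosen for the probability of knowing, the equal-variance Gaussian choice rule among m alternatives, and the placement of the know/don't-know state as the first stage of the OE model are assumptions made here for illustration, not the paper's exact specification; the function names (p_know, p_correct_mc, p_scores_oe) are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad


def p_know(theta, a, b):
    """Probability of the latent 'know' state for ability theta (assumed 2PL-style form)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))


def p_guess_sdt(d_prime, m):
    """SDT choice rule: probability that the correct alternative yields the largest of m
    signals, with the correct option ~ N(d', 1) and the m - 1 distractors ~ N(0, 1)."""
    integrand = lambda x: norm.pdf(x, loc=d_prime) * norm.cdf(x) ** (m - 1)
    value, _ = quad(integrand, -np.inf, np.inf)
    return value


def p_correct_mc(theta, a, b, d_prime, m=4):
    """MC item: correct if the item is known; otherwise correct via the SDT choice process."""
    pk = p_know(theta, a, b)
    return pk + (1.0 - pk) * p_guess_sdt(d_prime, m)


def p_scores_oe(theta, a, b, steps):
    """OE item scored 0..len(steps)+1: the first stage reuses the same know/don't-know
    probability, and a sequential (continuation-ratio) logit governs the later steps."""
    pk = p_know(theta, a, b)
    probs = [1.0 - pk]           # 'don't know' -> lowest score
    reached = pk
    for b_k in steps:            # P(pass step k | reached step k)
        p_step = 1.0 / (1.0 + np.exp(-a * (theta - b_k)))
        probs.append(reached * (1.0 - p_step))   # fail this step -> this score
        reached *= p_step
    probs.append(reached)        # passed every step -> top score
    return probs


# One examinee answering one MC item (4 alternatives) and one OE item (scores 0-4),
# with both items linked through the same probability of knowing.
theta = 0.5
print(p_correct_mc(theta, a=1.2, b=0.0, d_prime=1.0, m=4))
print(p_scores_oe(theta, a=1.2, b=0.0, steps=[-0.5, 0.4, 1.3]))
```

In a real analysis the item and person parameters would be estimated jointly from response data rather than fixed by hand; the sketch only shows how both item formats can be driven by one shared probability of knowing.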
Source journal
Educational and Psychological Measurement (Medicine; Mathematics, Interdisciplinary Applications)
CiteScore: 5.50
Self-citation rate: 7.40%
Articles published: 49
Review turnaround: 6-12 weeks
Journal description: Educational and Psychological Measurement (EPM) publishes refereed scholarly work from all academic disciplines interested in the study of measurement theory, problems, and issues. Theoretical articles address new developments and techniques, and applied articles deal with innovative applications.
Latest articles from this journal
Discriminant Validity of Interval Response Formats: Investigating the Dimensional Structure of Interval Widths.
Novick Meets Bayes: Improving the Assessment of Individual Students in Educational Practice and Research by Capitalizing on Assessors' Prior Beliefs.
Differential Item Functioning Effect Size Use for Validity Information.
Optimal Number of Replications for Obtaining Stable Dynamic Fit Index Cutoffs.
Invariance: What Does Measurement Invariance Allow Us to Claim?