Distractor Analysis for Multiple-Choice Tests: An Empirical Study With International Language Assessment Data

ETS Research Report Series · Pub Date: 2019-11-06 · DOI: 10.1002/ets2.12275 · JCR: Q3 (Social Sciences)
Shelby J. Haberman, Yang Liu, Yi-Hsuan Lee
Volume 2019, Issue 1, pp. 1–16
Citations: 2

Abstract

Distractor analyses are routinely conducted in educational assessments with multiple-choice items. In this research report, we focus on three item response models for distractors: (a) the traditional nominal response (NR) model, (b) a combination of a two-parameter logistic model for item scores and a NR model for selections of incorrect distractors, and (c) a model in which the item score satisfies a two-parameter logistic model and distractor selection and proficiency are conditionally independent, given that an incorrect response is selected. Model comparisons involve generalized residuals, information measures, scale scores, and reliability estimates. To illustrate the methodology, a study of an international assessment of proficiency of nonnative speakers of a single target language used to make high-stakes decisions compares the models under study.
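The three models compared in the report can be illustrated with a short sketch. A minimal, hypothetical Python example follows (all item parameters below are invented for illustration and are not taken from the study): the nominal response (NR) model assigns each response option a probability proportional to exp(a_k·θ + c_k), the two-parameter logistic (2PL) model gives the probability of a correct item score, and model (b) combines the two by distributing the incorrect-response probability over distractors via an NR model.

```python
import math

def nominal_response_probs(theta, slopes, intercepts):
    """Option-selection probabilities under the nominal response (NR) model:
    P(Y = k | theta) is proportional to exp(a_k * theta + c_k)."""
    logits = [a * theta + c for a, c in zip(slopes, intercepts)]
    m = max(logits)  # subtract the max logit for numerical stability
    weights = [math.exp(z - m) for z in logits]
    total = sum(weights)
    return [w / total for w in weights]

def two_pl_prob(theta, a, b):
    """Probability of a correct item score under the two-parameter
    logistic (2PL) model with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def hybrid_probs(theta, a, b, wrong_slopes, wrong_intercepts):
    """Model (b) from the abstract: a 2PL model for the item score,
    combined with an NR model over the incorrect options conditional
    on an incorrect response being selected.  Returns the probability
    of the correct option followed by each distractor's probability."""
    p_correct = two_pl_prob(theta, a, b)
    p_wrong = nominal_response_probs(theta, wrong_slopes, wrong_intercepts)
    return [p_correct] + [(1.0 - p_correct) * p for p in p_wrong]
```

Model (c) is the special case in which the distractor slopes are all zero, so that, given an incorrect response, distractor choice carries no information about proficiency.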


Source journal: ETS Research Report Series (Social Sciences – Education)
CiteScore: 1.20
Self-citation rate: 0.00%
Articles per year: 17