Investigating Approaches to Controlling Item Position Effects in Computerized Adaptive Tests

Educational Measurement: Issues and Practice · IF 1.9 · CAS Tier 4 (Education) · JCR Q1 (Education & Educational Research) · Vol. 44(1), pp. 44-54 · Pub Date: 2024-10-27 · DOI: 10.1111/emip.12637
Ye Ma, Deborah J. Harris
{"title":"Investigating Approaches to Controlling Item Position Effects in Computerized Adaptive Tests","authors":"Ye Ma,&nbsp;Deborah J. Harris","doi":"10.1111/emip.12637","DOIUrl":null,"url":null,"abstract":"<p>Item position effect (IPE) refers to situations where an item performs differently when it is administered in different positions on a test. The majority of previous research studies have focused on investigating IPE under linear testing. There is a lack of IPE research under adaptive testing. In addition, the existence of IPE might violate Item Response Theory (IRT)’s item parameter invariance assumption, which facilitates applications of IRT in various psychometric tasks such as computerized adaptive testing (CAT). Ignoring IPE might lead to issues such as inaccurate ability estimation in CAT. This article extends research on IPE by proposing and evaluating approaches to controlling position effects under an item-level computerized adaptive test via a simulation study. The results show that adjusting IPE via a pretesting design (approach 3) or a pool design (approach 4) results in better ability estimation accuracy compared to no adjustment (baseline approach) and item-level adjustment (approach 2). Practical implications of each approach as well as future research directions are discussed as well.</p>","PeriodicalId":47345,"journal":{"name":"Educational Measurement-Issues and Practice","volume":"44 1","pages":"44-54"},"PeriodicalIF":1.9000,"publicationDate":"2024-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational Measurement-Issues and Practice","FirstCategoryId":"95","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1111/emip.12637","RegionNum":4,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

Item position effect (IPE) refers to situations in which an item performs differently depending on the position in which it is administered on a test. Most previous research has investigated IPE under linear testing; IPE under adaptive testing remains largely unexplored. Moreover, the existence of IPE may violate the item parameter invariance assumption of item response theory (IRT), which underpins applications of IRT in psychometric tasks such as computerized adaptive testing (CAT). Ignoring IPE can therefore lead to problems such as inaccurate ability estimation in CAT. This article extends research on IPE by proposing and evaluating, via a simulation study, approaches to controlling position effects in an item-level computerized adaptive test. The results show that adjusting for IPE through a pretesting design (approach 3) or a pool design (approach 4) yields more accurate ability estimation than no adjustment (the baseline approach) or item-level adjustment (approach 2). Practical implications of each approach and directions for future research are also discussed.
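To make the invariance issue concrete, consider a Rasch model in which an item's effective difficulty drifts with its serial position, say b_i + δ·(position − 1). The sketch below is a hypothetical illustration, not the paper's design: the linear drift form, the rate `delta`, the pool size, and the seed are all invented for this example. It simulates responses under such a drift and compares maximum-likelihood ability estimates obtained with and without the position adjustment:

```python
import numpy as np

rng = np.random.default_rng(7)

def p_correct(theta, b):
    """Rasch probability of a correct response given ability theta and difficulty b."""
    return 1.0 / (1.0 + np.exp(-(theta - b)))

# Illustrative setup (assumed, not from the paper): difficulty drifts
# linearly with serial position at rate delta per position.
n_items, delta, theta_true = 30, 0.02, 0.5
b = rng.normal(0.0, 1.0, n_items)          # calibrated (position-free) difficulties
positions = np.arange(n_items)             # administration order 0..n-1
b_effective = b + delta * positions        # IPE-shifted effective difficulties

# Simulate one examinee's responses under the shifted difficulties.
responses = rng.random(n_items) < p_correct(theta_true, b_effective)

def mle_theta(responses, difficulties, grid=np.linspace(-4, 4, 801)):
    """Grid-search maximum-likelihood ability estimate."""
    p = p_correct(grid[:, None], difficulties[None, :])
    loglik = np.where(responses, np.log(p), np.log(1 - p)).sum(axis=1)
    return grid[np.argmax(loglik)]

# Scoring with the calibrated difficulties ignores IPE; scoring with the
# position-adjusted difficulties corrects for it.
print("theta ignoring IPE: ", mle_theta(responses, b))
print("theta adjusting IPE:", mle_theta(responses, b_effective))
```

Under this setup, scoring with the position-free calibrated difficulties tends to underestimate θ, because later items are effectively harder than their calibrated values indicate; adjusting the difficulties for position removes that bias, which is the intuition behind the adjustment approaches the article evaluates.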

Latest articles in this journal:
Classroom Assessment Validation: Proficiency Claims and Uses
The Sensitivity of Value-Added Estimates to Test Scoring Decisions
Issue Information
AI-Generated Essays: Characteristics and Implications on Automated Scoring and Academic Integrity
Issue Information