The Impact of Cheating on Score Comparability via Pool-Based IRT Pre-equating

Journal of Educational Measurement | IF 1.4 | JCR Q3, Psychology, Applied | Pub Date: 2022-05-01 | DOI: 10.1111/jedm.12321
Jinghua Liu, Kirk Becker
Journal of Educational Measurement, Vol. 59, No. 2, pp. 208–230.
Citations: 3

Abstract

For any testing program that administers multiple forms across multiple years, maintaining score comparability via equating is essential. With continuous testing and high-stakes results, especially in less secure online administrations, testing programs must consider the potential for cheating on their exams. This study used empirical and simulated data to examine the impact of item exposure and prior knowledge on the estimation of item difficulty and test takers' abilities via pool-based IRT pre-equating. Raw-to-theta transformations were derived from two groups of test takers, with and without possible prior knowledge of exposed items, and these were compared to a criterion raw-to-theta transformation. Results indicated that item exposure has a large impact on item difficulty, altering the difficulty not only of exposed items but also of unexposed items. Item exposure makes test takers with prior knowledge appear more able. Further, theta estimation bias for test takers without prior knowledge increases as more test takers with possible prior knowledge enter the calibration population. Score inflation occurs for test takers both with and without prior knowledge, especially for those with lower abilities.
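The raw-to-theta transformation at the heart of this design can be sketched under the Rasch model: the expected raw score is the test characteristic curve summed over item probabilities, and the transformation inverts that curve. The sketch below (illustrative numbers only, not taken from the study; the 0.8-logit shift and item difficulties are assumptions) shows how contaminated item-difficulty estimates change the conversion a given raw score receives, which is the mechanism by which exposure breaks score comparability.

```python
import math

def expected_raw(theta, difficulties):
    # Rasch model: P(correct | theta, b) = 1 / (1 + exp(-(theta - b)));
    # the test characteristic curve is the sum over items.
    return sum(1.0 / (1.0 + math.exp(-(theta - b))) for b in difficulties)

def raw_to_theta(raw, difficulties, lo=-6.0, hi=6.0, tol=1e-6):
    # Invert the test characteristic curve by bisection
    # (it is strictly increasing in theta).
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if expected_raw(mid, difficulties) < raw:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Criterion (uncontaminated) difficulties for a hypothetical 10-item form,
# versus a pool calibration in which exposure made 4 items' difficulty
# estimates drift 0.8 logits easier.
criterion = [-1.5, -1.0, -0.5, -0.2, 0.0, 0.2, 0.5, 1.0, 1.5, 2.0]
contaminated = [b - 0.8 if i < 4 else b for i, b in enumerate(criterion)]

raw = 6
theta_criterion = raw_to_theta(raw, criterion)
theta_contaminated = raw_to_theta(raw, contaminated)
print(f"criterion theta: {theta_criterion:.2f}, "
      f"contaminated theta: {theta_contaminated:.2f}")
```

The same raw score now maps to two different theta values depending on which calibration produced the conversion table, so scores pre-equated through the contaminated pool are no longer comparable to scores from the criterion transformation.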

Source journal metrics: CiteScore 2.30 | Self-citation rate 7.70% | Articles published: 46
About the journal: The Journal of Educational Measurement (JEM) publishes original measurement research, provides reviews of measurement publications, and reports on innovative measurement applications. The topics addressed will interest those concerned with the practice of measurement in field settings, as well as measurement theorists. In addition to presenting new contributions to measurement theory and practice, JEM also serves as a vehicle for improving educational measurement applications in a variety of settings.
Latest articles from this journal:
- Sequential Reservoir Computing for Log File-Based Behavior Process Data Analyses
- Issue Information
- Exploring Latent Constructs through Multimodal Data Analysis
- Robustness of Item Response Theory Models under the PISA Multistage Adaptive Testing Designs
- Modeling Nonlinear Effects of Person-by-Item Covariates in Explanatory Item Response Models: Exploratory Plots and Modeling Using Smooth Functions