Item Parameter Drift in Context Questionnaires from International Large-Scale Assessments

International Journal of Testing · Impact Factor 1.0 (Q2, Social Sciences, Interdisciplinary) · Pub Date: 2018-09-14 · DOI: 10.1080/15305058.2018.1481852
HyeSun Lee, K. Geisinger
Citations: 2

Abstract

The purpose of the current study was to examine the impact of item parameter drift (IPD) occurring in context questionnaires from an international large-scale assessment and to determine the most appropriate way to address IPD. Focusing on psychometric and educational research settings where scores from context questionnaires composed of polytomous items are used to classify examinees, the study investigated the impact of IPD on the estimation of questionnaire scores and on classification accuracy across five manipulated factors: the length of the questionnaire, the proportion of items exhibiting IPD, the direction and the magnitude of IPD, and three decisions about how to handle IPD. The results indicated that IPD occurring in a short context questionnaire substantially reduced the accuracy of both score estimation and examinee classification. Classification accuracy decreased considerably, especially at the lowest and highest categories of a trait. Contrary to the recommendation in the educational testing literature, the current study demonstrated that keeping items exhibiting IPD, or removing them only for the scale transformation, was appropriate when IPD occurred in relatively short context questionnaires. Using 2011 TIMSS data from Iran, an applied example demonstrated how the guidance provided can be applied when making appropriate decisions about IPD.
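The simulation design summarized above can be sketched in miniature: generate polytomous responses under an item response model, induce location drift in a subset of items, then score the same responses with and without accounting for the drift and compare trait-category classification. Everything in the sketch below is an illustrative assumption rather than the study's actual setup: the graded response model, EAP scoring on a grid, ten items with three drifted by +0.5 in location, and three trait categories defined by cutpoints at ±1.

```python
import numpy as np

rng = np.random.default_rng(0)

def grm_probs(theta, a, b):
    """Category probabilities for one item under the graded response model.
    theta: latent trait value; a: discrimination; b: ascending thresholds."""
    p_star = 1.0 / (1.0 + np.exp(-a * (theta - np.asarray(b))))  # P(X >= k)
    upper = np.concatenate(([1.0], p_star))
    lower = np.concatenate((p_star, [0.0]))
    return upper - lower  # probabilities of categories 0 .. len(b)

def eap(responses, a_all, b_all, grid=np.linspace(-4, 4, 81)):
    """EAP trait estimate under a standard-normal prior, via a grid."""
    log_post = -0.5 * grid ** 2
    for x, a, b in zip(responses, a_all, b_all):
        log_post += np.log([grm_probs(t, a, b)[x] for t in grid])
    w = np.exp(log_post - log_post.max())
    return float((grid * w).sum() / w.sum())

n_items, n_persons = 10, 100
a_all = rng.uniform(1.0, 2.0, n_items)
b_old = [np.sort(rng.normal(0.0, 1.0, 3)) for _ in range(n_items)]  # old calibration
# IPD: the first 3 items drift +0.5 in location (become harder to endorse)
b_new = [b + 0.5 if i < 3 else b for i, b in enumerate(b_old)]

theta = rng.normal(0.0, 1.0, n_persons)
# responses are generated under the drifted (current) parameters
resp = np.array([[rng.choice(4, p=grm_probs(t, a_all[i], b_new[i]))
                  for i in range(n_items)] for t in theta])

est_ignore = np.array([eap(r, a_all, b_old) for r in resp])  # drift ignored
est_aware = np.array([eap(r, a_all, b_new) for r in resp])   # drift handled

cuts = [-1.0, 1.0]  # three trait categories: low / mid / high
acc_ignore = np.mean(np.digitize(est_ignore, cuts) == np.digitize(theta, cuts))
acc_aware = np.mean(np.digitize(est_aware, cuts) == np.digitize(theta, cuts))
print(f"accuracy, drift ignored: {acc_ignore:.2f}; drift handled: {acc_aware:.2f}")
```

In this sketch, ignoring the drift biases trait estimates downward, since the drifted items became harder to endorse but are still scored with their old, easier thresholds; the bias is largest relative to the information available when the questionnaire is short, which is the situation the study highlights.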
Source journal: International Journal of Testing (Social Sciences, Interdisciplinary)
CiteScore: 3.60
Self-citation rate: 11.80%
Articles published: 13
Latest articles from this journal:
- Combining Mokken scale analysis with Rasch measurement theory to explore differences in measurement quality between subgroups
- Examining the construct validity of the MIDUS version of the Multidimensional Personality Questionnaire (MPQ)
- Where nonresponse is at its loudest: Cross-country and individual differences in item nonresponse across the PISA 2018 student questionnaire
- The choice between cognitive diagnosis and item response theory: A case study from medical education
- Beyond group comparisons: Accounting for intersectional sources of bias in international survey measures