IRTrees for skipping items in PIRLS

Educational Assessment Evaluation and Accountability · Pub Date: 2024-07-24 · DOI: 10.1007/s11092-024-09439-4 · IF 2.8 · JCR Q1 (Education & Educational Research) · CAS Tier 3 (Education)
Andrés Christiansen, Rianne Janssen
{"title":"IRTrees for skipping items in PIRLS","authors":"Andrés Christiansen, Rianne Janssen","doi":"10.1007/s11092-024-09439-4","DOIUrl":null,"url":null,"abstract":"<p>In international large-scale assessments, students may not be compelled to answer every test item: a student can decide to skip a seemingly difficult item or may drop out before the end of the test is reached. The way these missing responses are treated will affect the estimation of the item difficulty and student ability, and ultimately affect the country’s score. In the Progress in International Reading Literacy Study (PIRLS), incorrect answer substitution is used. This means that skipped and omitted items are treated as incorrect responses. In the present study, the effect of this approach is investigated. The data of 2006, 2011, and 2016 cycles of PIRLS were analyzed using IRTree models in which a sequential tree structure is estimated to model the full response process. Item difficulty, students’ ability, and country means were estimated and compared with results from a Rasch model using the standard PIRLS approach to missing values. Results showed that the IRTree model was able to disentangle the students’ ability and their propensity to skip items, reducing the correlation between ability and the proportion of skipped items in comparison to the Rasch model. Nevertheless, at the country level, the aggregated scores showed no important differences between models for the pooled sample, but some differences within countries across cycles.</p>","PeriodicalId":46725,"journal":{"name":"Educational Assessment Evaluation and Accountability","volume":null,"pages":null},"PeriodicalIF":2.8000,"publicationDate":"2024-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational Assessment Evaluation and Accountability","FirstCategoryId":"95","ListUrlMain":"https://doi.org/10.1007/s11092-024-09439-4","RegionNum":3,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

In international large-scale assessments, students may not be compelled to answer every test item: a student can decide to skip a seemingly difficult item or may drop out before the end of the test is reached. The way these missing responses are treated affects the estimation of item difficulty and student ability, and ultimately the country’s score. In the Progress in International Reading Literacy Study (PIRLS), incorrect answer substitution is used: skipped and omitted items are treated as incorrect responses. The present study investigates the effect of this approach. Data from the 2006, 2011, and 2016 cycles of PIRLS were analyzed using IRTree models, in which a sequential tree structure is estimated to model the full response process. Item difficulty, student ability, and country means were estimated and compared with results from a Rasch model using the standard PIRLS approach to missing values. Results showed that the IRTree model was able to disentangle students’ ability from their propensity to skip items, reducing the correlation between ability and the proportion of skipped items in comparison to the Rasch model. Nevertheless, at the country level, the aggregated scores showed no important differences between models for the pooled sample, but some differences within countries across cycles.
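The sequential tree structure mentioned in the abstract can be illustrated with a minimal two-node IRTree for skipping (a sketch of the general approach under notation assumed here, not necessarily the exact parameterisation used in the paper): node 1 models whether person p attempts item i, governed by a response propensity η_p and an item skipping threshold δ_i, and node 2 models correctness conditional on attempting, governed by ability θ_p and item difficulty β_i:

```latex
\Pr(\text{attempt}_{pi} = 1) = \frac{\exp(\eta_p - \delta_i)}{1 + \exp(\eta_p - \delta_i)},
\qquad
\Pr(X_{pi} = 1 \mid \text{attempt}_{pi} = 1) = \frac{\exp(\theta_p - \beta_i)}{1 + \exp(\theta_p - \beta_i)}
```

Under incorrect answer substitution, a skipped item is scored X_pi = 0, so a low response propensity η_p is absorbed into a low ability estimate θ_p; the tree separates the two traits, which are typically allowed to correlate. In practice, estimation can proceed by expanding each scored response into two binary pseudo-items and calibrating each node. A minimal Python/pandas sketch of that expansion, with hypothetical column names and the assumed coding 1 = correct, 0 = incorrect, NaN = skipped:

```python
import numpy as np
import pandas as pd

def expand_to_irtree(responses: pd.DataFrame) -> pd.DataFrame:
    """Expand scored responses into two pseudo-items per item.

    Assumed input coding: 1 = correct, 0 = incorrect, NaN = skipped.
    Node 1 (<item>_attempt): 1 if answered, 0 if skipped.
    Node 2 (<item>_correct): 1/0 if answered, NaN if skipped, so the
    conditional node stays genuinely missing for skipped items.
    """
    attempt = responses.notna().astype(int)
    attempt.columns = [f"{c}_attempt" for c in responses.columns]
    correct = responses.copy()  # already NaN where skipped
    correct.columns = [f"{c}_correct" for c in responses.columns]
    return pd.concat([attempt, correct], axis=1)

# Toy example: three students, two items.
scored = pd.DataFrame(
    {"item1": [1, 0, np.nan], "item2": [np.nan, 1, 0]},
    index=["s1", "s2", "s3"],
)
print(expand_to_irtree(scored))
#     item1_attempt  item2_attempt  item1_correct  item2_correct
# s1              1              0            1.0            NaN
# s2              1              1            0.0            1.0
# s3              0              1            NaN            0.0
```

Each block of pseudo-items can then be calibrated with a Rasch-type model in any IRT package, with the NaNs in the conditional node treated as missing rather than incorrect: the attempt nodes identify the skipping propensity, and the correct nodes identify ability.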

Source journal: Educational Assessment Evaluation and Accountability (Education & Educational Research)
CiteScore: 9.40
Self-citation rate: 2.60%
Articles per year: 23
About the journal: The main objective of this international journal is to advance knowledge and dissemination of research on and about evaluation, assessment and accountability:
- of all kinds (e.g. person, programme, organisation),
- on various levels (state, regional, local),
- in all fields of education (primary, secondary, higher education/tertiary, as well as the non-school sector) and across all different life phases (e.g. adult education/andragogy/Human Resource Management/professional development).

The journal provides readers with an understanding of the rich contextual nature of evaluation, assessment and accountability in education. It is theory-oriented and methodology-based and seeks to connect research, policy making and practice. Therefore, the journal explores and discusses:
- theories of evaluation, assessment and accountability,
- the function, role, aims and purpose of evaluation, assessment and accountability,
- the impact of evaluation, assessment and accountability,
- the methodology, design and methods of evaluation, assessment and accountability,
- principles, standards and quality of evaluation, assessment and accountability,
- issues of planning, coordinating, conducting and reporting of evaluation, assessment and accountability.

The journal also covers the quality of the different instruments, procedures and approaches used for evaluation, assessment and accountability. It only includes research findings from evaluation, assessment and accountability if the design or approach is meta-reflected in the article. The journal publishes outstanding empirical works, peer-reviewed by eminent scholars around the world.
Latest articles in this journal

IRTrees for skipping items in PIRLS
How representative is the Swedish PISA sample? A comparison of PISA and register data
Dimensions of teachers’ data literacy: A systematic review of literature from 1990 to 2021
Examining pre-service teachers’ feedback on low- and high-quality written assignments
Legitimising capital: parent organisations and their resistance to testing in England