IRTrees for skipping items in PIRLS
Andrés Christiansen, Rianne Janssen
Educational Assessment Evaluation and Accountability (Q1, Education & Educational Research; Impact Factor 2.8)
Published: 2024-07-24 (Journal Article)
DOI: 10.1007/s11092-024-09439-4 — https://doi.org/10.1007/s11092-024-09439-4
Citations: 0
Abstract
In international large-scale assessments, students may not be compelled to answer every test item: a student can decide to skip a seemingly difficult item or may drop out before reaching the end of the test. The way these missing responses are treated affects the estimation of item difficulty and student ability, and ultimately the country's score. In the Progress in International Reading Literacy Study (PIRLS), incorrect-answer substitution is used: skipped and omitted items are treated as incorrect responses. The present study investigates the effect of this approach. Data from the 2006, 2011, and 2016 cycles of PIRLS were analyzed using IRTree models, in which a sequential tree structure is estimated to model the full response process. Item difficulties, student abilities, and country means were estimated and compared with results from a Rasch model using the standard PIRLS approach to missing values. Results showed that the IRTree model was able to disentangle students' ability from their propensity to skip items, reducing the correlation between ability and the proportion of skipped items in comparison to the Rasch model. Nevertheless, at the country level, the aggregated scores showed no important differences between the models for the pooled sample, though some differences emerged within countries across cycles.
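The contrast between the two treatments of missing responses can be illustrated with a minimal sketch. Under incorrect-answer substitution, a skipped item is simply scored 0; under a sequential IRTree, each response is expanded into pseudo-items for the tree's nodes — a first node for the decision to attempt versus skip, and a second node for accuracy, observed only when the item was attempted. The function names below are hypothetical and the two-node structure is a simplified stand-in for the models estimated in the paper.

```python
# Hypothetical illustration of the two treatments of missing responses.
# Raw responses: 1 = correct, 0 = incorrect, None = skipped/omitted.

def irtree_recode(response):
    """Expand one response into (node1_attempt, node2_correct) pseudo-items.

    Node 1 codes the propensity to attempt the item (1) or skip it (0).
    Node 2 codes accuracy, and is unobserved (None) when the item was skipped.
    """
    if response is None:           # skipped item
        return (0, None)           # no attempt; accuracy stays missing
    return (1, response)           # attempted; accuracy is the raw score

def incorrect_substitution(response):
    """PIRLS-style scoring: a skipped item is treated as incorrect."""
    return 0 if response is None else response

responses = [1, 0, None, 1, None]

tree_data = [irtree_recode(r) for r in responses]
rasch_data = [incorrect_substitution(r) for r in responses]
```

In the IRTree expansion, skipping an item contributes information only to the skip-propensity node, so ability estimates are based on attempted items alone; in the substitution scheme, every skip lowers the ability estimate directly, which is the source of the correlation the paper reports.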
About the journal
The main objective of this international journal is to advance knowledge and the dissemination of research on and about evaluation, assessment and accountability:
- of all kinds (e.g. person, programme, organisation),
- on various levels (state, regional, local),
- in all fields of education (primary, secondary, higher/tertiary, as well as the non-school sector) and across all life phases (e.g. adult education/andragogy/human resource management/professional development).
The journal provides readers with an understanding of the rich contextual nature of evaluation, assessment and accountability in education. It is theory-oriented and methodology-based and seeks to connect research, policy making and practice.
The journal explores and discusses:
- theories of evaluation, assessment and accountability,
- their function, role, aims and purpose,
- their impact,
- their methodology, design and methods,
- principles, standards and quality,
- issues of planning, coordinating, conducting and reporting.
The journal also covers the quality of the different instruments, procedures and approaches used for evaluation, assessment and accountability. Research findings from evaluation, assessment and accountability are included only if their design or approach is meta-reflected in the article. The journal publishes outstanding empirical work, peer-reviewed by eminent scholars around the world.