The role of item format in the PISA 2018 mathematics literacy assessment: A cross-country study

Journal: Studies in Educational Evaluation (Q1, EDUCATION & EDUCATIONAL RESEARCH; Impact Factor 2.6)
DOI: 10.1016/j.stueduc.2024.101401
Publication date: 2024-09-14
Publication type: Journal Article
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S0191491X24000804/pdfft?md5=508c28bd0233e2005407a3aeaf5ccee2&pid=1-s2.0-S0191491X24000804-main.pdf
Citations: 0
Abstract
When construct-irrelevant sources affect item difficulty, the validity of an assessment is compromised. Using responses from 260,000 students in 71 countries to the Programme for International Student Assessment (PISA) 2018 mathematics assessment and cross-classified mixed-effects models, we examined three validity concerns associated with a construct-irrelevant factor, item format: (a) whether format influenced item difficulty; (b) whether item format's impact on difficulty varied across countries, undermining PISA's foundational goal of meaningful country comparisons; and (c) whether item format effects differed between genders, affecting assessment fairness. Item format accounted for a substantial average of 12% of the variance in item difficulties. The effect of item format was non-uniform across countries: roughly 30% of the variance in item difficulties was attributable to format in lower-performing countries, versus 10% in higher-performing countries, challenging the comparability of educational outcomes. The impact of gender on item format differences was minor. Implications for secondary research and assessment design are discussed.
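The abstract's variance figures come from cross-classified mixed-effects models, in which item difficulties carry crossed random effects for items and countries alongside a fixed effect for item format. A minimal sketch of that model structure in Python with statsmodels is shown below; the simulated data, effect sizes, and variable names are illustrative assumptions, not the authors' actual specification or software.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

# Hypothetical simulated item-by-country difficulty data.
n_items, n_countries = 24, 12
fmt = np.where(np.arange(n_items) % 2 == 0, "open", "multiple_choice")
item_eff = rng.normal(0.0, 0.5, n_items)        # item random intercepts
country_eff = rng.normal(0.0, 0.3, n_countries)  # country random intercepts

rows = []
for i in range(n_items):
    for c in range(n_countries):
        # Assume open-response items are harder (a 0.4 format effect).
        diff = (0.4 * (fmt[i] == "open") + item_eff[i] + country_eff[c]
                + rng.normal(0.0, 0.2))
        rows.append({"item": f"i{i}", "country": f"c{c}",
                     "fmt": fmt[i], "difficulty": diff})
df = pd.DataFrame(rows)
df["const"] = 1  # single grouping level -> crossed effects via vc_formula

# Cross-classified model: crossed item and country random intercepts,
# item format as a fixed effect.
vc = {"item": "0 + C(item)", "country": "0 + C(country)"}
fit = smf.mixedlm("difficulty ~ fmt", df, groups="const",
                  vc_formula=vc).fit()

print(fit.fe_params)  # fixed effect of item format
print(fit.vcomp)      # item and country variance components
```

In practice, a "share of item-difficulty variance due to format" like the abstract's 12% would be obtained by comparing item-level variance components between models fit with and without the format predictor; the single fit above only shows the model structure.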
Journal description:
Studies in Educational Evaluation publishes original reports of evaluation studies. Four types of articles are published by the journal: (a) Empirical evaluation studies representing evaluation practice in educational systems around the world; (b) Theoretical reflections and empirical studies related to issues involved in the evaluation of educational programs, educational institutions, educational personnel and student assessment; (c) Articles summarizing the state-of-the-art concerning specific topics in evaluation in general or in a particular country or group of countries; (d) Book reviews and brief abstracts of evaluation studies.