The role of item format in the PISA 2018 mathematics literacy assessment: A cross-country study

Studies in Educational Evaluation, Volume 83, Article 101401 | Impact Factor: 2.6 | CAS Tier 2 (Education) | JCR Q1 (Education & Educational Research) | Published: 2024-09-14 | DOI: 10.1016/j.stueduc.2024.101401 | URL: https://www.sciencedirect.com/science/article/pii/S0191491X24000804
Kseniia Marcq, Erika Jassuly Chalén Donayre, Johan Braeken
{"title":"The role of item format in the PISA 2018 mathematics literacy assessment: A cross-country study","authors":"Kseniia Marcq ,&nbsp;Erika Jassuly Chalén Donayre ,&nbsp;Johan Braeken","doi":"10.1016/j.stueduc.2024.101401","DOIUrl":null,"url":null,"abstract":"<div><p>When construct-irrelevant sources affect item difficulty, validity of the assessment is compromised. Using responses of 260000 students from 71 countries to the Programme for International Student Assessment (PISA) 2018 mathematics assessment and cross-classified mixed effects models, we examined three validity concerns associated with the construct-irrelevant factor, item format: whether the format influenced item difficulty, whether item format’s impact on difficulty varied across countries, undermining PISA’s foundational goal of meaningful country comparisons, and whether item format effects differed between genders, affecting assessment fairness. Item format contributed to a substantial average of 12 % of variance in item difficulties. The effect of item format was non-uniform across countries, with 30 % of the variance in item difficulties being due to format in lower-performing countries, and 10 % in higher-performing countries, challenging the comparability of educational outcomes. The impact of gender on item format differences was minor. Implications for secondary research and assessment design are discussed.</p></div>","PeriodicalId":47539,"journal":{"name":"Studies in Educational Evaluation","volume":"83 ","pages":"Article 101401"},"PeriodicalIF":2.6000,"publicationDate":"2024-09-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0191491X24000804/pdfft?md5=508c28bd0233e2005407a3aeaf5ccee2&pid=1-s2.0-S0191491X24000804-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Studies in Educational Evaluation","FirstCategoryId":"95","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0191491X24000804","RegionNum":2,"RegionCategory":"教育学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"EDUCATION & EDUCATIONAL RESEARCH","Score":null,"Total":0}
Citations: 0

Abstract

When construct-irrelevant sources affect item difficulty, the validity of the assessment is compromised. Using responses of 260,000 students from 71 countries to the Programme for International Student Assessment (PISA) 2018 mathematics assessment and cross-classified mixed effects models, we examined three validity concerns associated with the construct-irrelevant factor of item format: (a) whether format influenced item difficulty; (b) whether the impact of item format on difficulty varied across countries, undermining PISA’s foundational goal of meaningful country comparisons; and (c) whether item format effects differed between genders, affecting assessment fairness. Item format accounted for a substantial 12% of the variance in item difficulties on average. The effect of item format was non-uniform across countries, with format accounting for 30% of the variance in item difficulties in lower-performing countries and 10% in higher-performing countries, challenging the comparability of educational outcomes. The impact of gender on item format differences was minor. Implications for secondary research and assessment design are discussed.
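The abstract names the modeling approach (cross-classified mixed effects models applied to item responses) but not its exact specification. As a rough illustration of the general idea only, and not the authors' actual analysis, the sketch below fits a logistic model with crossed random effects for students and items and item format as an item-level predictor; the input file and all column names (correct, item_format, student_id, item_id) are hypothetical.

# Illustrative sketch only, not the authors' code: a cross-classified
# logistic mixed model with crossed student and item random effects
# and item format as a fixed effect. Column names are hypothetical.
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Hypothetical long-format file: one row per student-item response.
df = pd.read_csv("pisa2018_math_responses.csv")

model = BinomialBayesMixedGLM.from_formula(
    "correct ~ C(item_format)",          # fixed effect of item format on the log-odds of success
    vc_formulas={
        "student": "0 + C(student_id)",  # random intercepts for students (ability)
        "item": "0 + C(item_id)",        # random intercepts for items (difficulty)
    },
    data=df,
)
result = model.fit_vb()                  # variational Bayes fit
print(result.summary())

In a model of this kind, the share of item-side variance attributable to the format term is roughly what statements such as "item format accounted for 12% of the variance in item difficulties" refer to; country-varying format effects would additionally require country-by-format random slopes. At the full PISA scale (260,000 students), a dense formula-based fit like this sketch would be impractical, and specialized sparse GLMM software would be needed.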

Source journal: Studies in Educational Evaluation
CiteScore: 6.90
Self-citation rate: 6.50%
Publication volume: 90
Review time: 62 days
Journal description: Studies in Educational Evaluation publishes original reports of evaluation studies. Four types of articles are published by the journal: (a) Empirical evaluation studies representing evaluation practice in educational systems around the world; (b) Theoretical reflections and empirical studies related to issues involved in the evaluation of educational programs, educational institutions, educational personnel and student assessment; (c) Articles summarizing the state-of-the-art concerning specific topics in evaluation in general or in a particular country or group of countries; (d) Book reviews and brief abstracts of evaluation studies.
Latest articles in this journal:
Teaching quality and student achievement inequalities in low- and middle-income countries: A hierarchical linear model analysis
Do children speaking indigenous and regional languages benefit equally from updated curricula? A report on a longitudinal quasi-experimental pilot study in Central Asia
Exploring the impact of student perceptions of Assessment for Learning on intrinsic motivation
How are pre-service physical education teachers’ perceptions of educator-created (dis)empowering climates associated with their motivational processes and teaching intention?
Online peer feedback training based on self-regulated learning in English as a foreign language writing: Perceived usefulness and students’ engagement