The impact of inconsistent responders to mixed-worded scales on inferences in international large-scale assessments

IF 2.7 · JCR Q1, Education & Educational Research (CAS Tier 3, Education) · Assessment in Education-Principles Policy & Practice · Pub Date: 2021-11-22 · DOI: 10.1080/0969594X.2021.2005302
Isa Steinmann, Daniel Sánchez, Saskia van Laar, J. Braeken
Citations: 3

Abstract

Questionnaire scales that are mixed-worded, i.e., include both positively and negatively worded items, often suffer from issues like low reliability and more complex latent structures than intended. Part of the problem might be that some responders fail to respond consistently to the mixed-worded items. We investigated the prevalence and impact of inconsistent responders in 37 primary education systems participating in the joint PIRLS/TIMSS 2011 assessment. Using the mean absolute difference method and three mixed-worded self-concept scales, we identified between 2%–36% of students as inconsistent responders across education systems. Consistent with expectations, these students showed lower average achievement scores and had a higher risk of being identified as inconsistent on more than one scale. We also found that the inconsistent responders biased the estimated dimensionality and reliability of the scales. The impact on external validity measures was limited and unsystematic. We discuss implications for the use and development of questionnaire scales.
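The general idea behind flagging inconsistent responders on a mixed-worded scale can be sketched as follows. This is an illustrative implementation only: the function name, the 4-point scale, and the cutoff of 1.0 are hypothetical assumptions, not the authors' exact operationalization of the mean absolute difference method.

```python
import numpy as np

def flag_inconsistent(responses, positive_idx, negative_idx,
                      n_categories=4, cutoff=1.0):
    """Flag respondents whose answers to positively and negatively worded
    items disagree, via a mean-absolute-difference style index.

    responses: (n_respondents, n_items) Likert responses in 1..n_categories
    positive_idx / negative_idx: column indices of the two item groups
    cutoff: illustrative threshold above which a respondent is flagged
    """
    resp = np.asarray(responses, dtype=float)
    pos = resp[:, positive_idx]
    # Reverse-code negatively worded items so high values mean the same
    # thing (e.g. positive self-concept) on both item groups.
    neg = (n_categories + 1) - resp[:, negative_idx]
    # Absolute difference between each respondent's average response to
    # the positive items and to the reverse-coded negative items.
    mad = np.abs(pos.mean(axis=1) - neg.mean(axis=1))
    return mad >= cutoff

# Hypothetical 4-point scale: items 0-1 positively worded, 2-3 negatively worded
data = [
    [4, 4, 1, 1],   # consistent: endorses positive, rejects negative items
    [4, 4, 4, 4],   # inconsistent: endorses everything regardless of wording
]
print(flag_inconsistent(data, [0, 1], [2, 3]))  # → [False  True]
```

A consistent respondent produces a small difference after reverse-coding, whereas an acquiescent respondent who agrees with everything produces a large one; any real application would need a cutoff justified for the scale and sample at hand.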
Source journal
Assessment in Education-Principles Policy & Practice
CiteScore: 5.70
Self-citation rate: 3.10%
Articles per year: 29
Journal overview: Recent decades have witnessed significant developments in the field of educational assessment. New approaches to the assessment of student achievement have been complemented by the increasing prominence of educational assessment as a policy issue. In particular, there has been a growth of interest in modes of assessment that promote, as well as measure, standards and quality. These have profound implications for individual learners, institutions and the educational system itself. Assessment in Education provides a focus for scholarly output in the field of assessment. The journal is explicitly international in focus and encourages contributions from a wide range of assessment systems and cultures. The journal's intention is to explore both commonalities and differences in policy and practice.
Latest articles from this journal
- EduSEL-R – the refined educators' social-emotional learning questionnaire: expanded scope and improved validity
- Mapping oral feedback interactions in young pupils' writing
- A self-feedback model (SEFEMO): secondary and higher education students' self-assessment profiles
- Surprising Insights: rethinking Grades, Exams, and Assessment Cultures
- The conceptualisation implies the statistical model: implications for measuring domains of teaching quality