Separation of Traits and Extreme Response Style in IRTree Models: The Role of Mimicry Effects for the Meaningful Interpretation of Estimates

Educational and Psychological Measurement · Impact Factor 2.1 · CAS Region 3 (Psychology) · JCR Q2 (Mathematics, Interdisciplinary Applications) · Pub Date: 2023-12-22 · DOI: 10.1177/00131644231213319
Viola Merhof, Caroline M. Böhm, Thorsten Meiser
{"title":"Separation of Traits and Extreme Response Style in IRTree Models: The Role of Mimicry Effects for the Meaningful Interpretation of Estimates","authors":"Viola Merhof, Caroline M. Böhm, Thorsten Meiser","doi":"10.1177/00131644231213319","DOIUrl":null,"url":null,"abstract":"Item response tree (IRTree) models are a flexible framework to control self-reported trait measurements for response styles. To this end, IRTree models decompose the responses to rating items into sub-decisions, which are assumed to be made on the basis of either the trait being measured or a response style, whereby the effects of such person parameters can be separated from each other. Here we investigate conditions under which the substantive meanings of estimated extreme response style parameters are potentially invalid and do not correspond to the meanings attributed to them, that is, content-unrelated category preferences. Rather, the response style factor may mimic the trait and capture part of the trait-induced variance in item responding, thus impairing the meaningful separation of the person parameters. Such a mimicry effect is manifested in a biased estimation of the covariance of response style and trait, as well as in an overestimation of the response style variance. Both can lead to severely misleading conclusions drawn from IRTree analyses. A series of simulation studies reveals that mimicry effects depend on the distribution of observed responses and that the estimation biases are stronger the more asymmetrically the responses are distributed across the rating scale. It is further demonstrated that extending the commonly used IRTree model with unidimensional sub-decisions by multidimensional parameterizations counteracts mimicry effects and facilitates the meaningful separation of parameters. An empirical example of the Program for International Student Assessment (PISA) background questionnaire illustrates the threat of mimicry effects in real data. The implications of applying IRTree models for empirical research questions are discussed.","PeriodicalId":11502,"journal":{"name":"Educational and Psychological Measurement","volume":"179 1","pages":""},"PeriodicalIF":2.1000,"publicationDate":"2023-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Educational and Psychological Measurement","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1177/00131644231213319","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

Item response tree (IRTree) models are a flexible framework to control self-reported trait measurements for response styles. To this end, IRTree models decompose the responses to rating items into sub-decisions, which are assumed to be made on the basis of either the trait being measured or a response style, whereby the effects of such person parameters can be separated from each other. Here we investigate conditions under which the substantive meanings of estimated extreme response style parameters are potentially invalid and do not correspond to the meanings attributed to them, that is, content-unrelated category preferences. Rather, the response style factor may mimic the trait and capture part of the trait-induced variance in item responding, thus impairing the meaningful separation of the person parameters. Such a mimicry effect is manifested in a biased estimation of the covariance of response style and trait, as well as in an overestimation of the response style variance. Both can lead to severely misleading conclusions drawn from IRTree analyses. A series of simulation studies reveals that mimicry effects depend on the distribution of observed responses and that the estimation biases are stronger the more asymmetrically the responses are distributed across the rating scale. It is further demonstrated that extending the commonly used IRTree model with unidimensional sub-decisions by multidimensional parameterizations counteracts mimicry effects and facilitates the meaningful separation of parameters. An empirical example of the Program for International Student Assessment (PISA) background questionnaire illustrates the threat of mimicry effects in real data. The implications of applying IRTree models for empirical research questions are discussed.
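To make the decomposition concrete, the sketch below shows the standard three-pseudo-item IRTree coding for a five-point rating scale (midpoint, direction, and extremity nodes) and how node probabilities driven by a trait and an extreme-response-style (ERS) factor combine into category probabilities. This is a minimal illustration of the general model class, not the authors' exact specification: the function names, the Rasch-type (unit-loading) nodes, the thresholds, and the choice to tie the midpoint node to the same ERS factor are simplifying assumptions made for brevity.

```python
import numpy as np

# Standard IRTree pseudo-item coding for a 5-point rating scale
# (Boeckenholt-type decomposition): M = midpoint, D = direction (agree side),
# E = extremity. np.nan marks nodes that are not reached on a given branch.
PSEUDO_ITEMS = {
    1: (0, 0, 1),            # strongly disagree
    2: (0, 0, 0),            # disagree
    3: (1, np.nan, np.nan),  # midpoint
    4: (0, 1, 0),            # agree
    5: (0, 1, 1),            # strongly agree
}

def logistic(z):
    return 1.0 / (1.0 + np.exp(-z))

def node_probs(theta, eta_ers, b_m=0.0, b_d=0.0, b_e=0.0):
    """Node probabilities under a simple Rasch-type IRTree.
    The direction node loads on the trait theta; the extremity node loads on
    the extreme-response-style factor eta_ers; for brevity the midpoint node
    is tied to eta_ers with a negative sign (thresholds b_* are hypothetical)."""
    p_mid = logistic(-eta_ers - b_m)
    p_dir = logistic(theta - b_d)
    p_ext = logistic(eta_ers - b_e)
    return p_mid, p_dir, p_ext

def category_probs(theta, eta_ers):
    """Combine node probabilities along the tree branches into
    probabilities for the observed categories 1..5 (they sum to 1)."""
    p_m, p_d, p_e = node_probs(theta, eta_ers)
    return np.array([
        (1 - p_m) * (1 - p_d) * p_e,        # 1
        (1 - p_m) * (1 - p_d) * (1 - p_e),  # 2
        p_m,                                # 3
        (1 - p_m) * p_d * (1 - p_e),        # 4
        (1 - p_m) * p_d * p_e,              # 5
    ])

if __name__ == "__main__":
    probs = category_probs(theta=1.0, eta_ers=0.5)
    print(np.round(probs, 3), probs.sum())  # category probabilities sum to 1
```

In this unidimensional-node setup each sub-decision loads on a single person parameter. The multidimensional extensions discussed in the article would instead allow a node (for example, the extremity node) to load on both the trait and the ERS factor, which is what counteracts the mimicry effect.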
Source journal
Educational and Psychological Measurement (Mathematics, Interdisciplinary Applications)
CiteScore: 5.50
Self-citation rate: 7.40%
Articles per year: 49
Review time: 6-12 weeks
Journal description: Educational and Psychological Measurement (EPM) publishes refereed scholarly work from all academic disciplines interested in the study of measurement theory, problems, and issues. Theoretical articles address new developments and techniques, and applied articles deal with innovative applications.