A Framework for Detecting Both Main Effect and Interactive DIF in Multidimensional Forced-Choice Assessments

Organizational Research Methods | IF 8.9 | CAS Zone 2 (Management) | JCR Q1 (Management) | Pub Date: 2024-04-13 | DOI: 10.1177/10944281241244760
Kai Liu, Yi Zheng, Daxun Wang, Yan Cai, Yuanyuan Shi, Chongqin Xi, Dongbo Tu
{"title":"A Framework for Detecting Both Main Effect and Interactive DIF in Multidimensional Forced-Choice Assessments","authors":"Kai Liu, Yi Zheng, Daxun Wang, Yan Cai, Yuanyuan Shi, Chongqin Xi, Dongbo Tu","doi":"10.1177/10944281241244760","DOIUrl":null,"url":null,"abstract":"In recent decades, multidimensional forced-choice (MFC) tests have gained widespread popularity in organizational settings due to their effectiveness in reducing response biases. Detecting differential item functioning (DIF) is crucial in developing MFC tests, as it relates to test fairness and validity. However, existing methods appear insufficient for detecting DIF induced by the interaction between multiple covariates. Furthermore, for multi-category, ordered or continuous covariates, existing approaches often dichotomize them using a-priori cutoffs, commonly using the median of the covariates. This may lead to information loss and reduced power in detecting MFC DIF. To address these limitations, we propose a method to identify both main effect DIF and interactive DIF. This method can automatically search for the optimal cutoffs for ordered or continuous covariates without pre-defined cutoffs. We introduce the rationale behind the proposed method and evaluate its performance through three Monte Carlo simulation studies. Results demonstrate that the proposed method effectively identifies various DIF forms in MFC tests, thereby increasing detection power. Finally, we provide an empirical application to illustrate the practical applicability of the proposed method.","PeriodicalId":19689,"journal":{"name":"Organizational Research Methods","volume":null,"pages":null},"PeriodicalIF":8.9000,"publicationDate":"2024-04-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Organizational Research Methods","FirstCategoryId":"91","ListUrlMain":"https://doi.org/10.1177/10944281241244760","RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MANAGEMENT","Score":null,"Total":0}
Citations: 0

Abstract

In recent decades, multidimensional forced-choice (MFC) tests have gained widespread popularity in organizational settings due to their effectiveness in reducing response biases. Detecting differential item functioning (DIF) is crucial in developing MFC tests, as it relates to test fairness and validity. However, existing methods appear insufficient for detecting DIF induced by the interaction between multiple covariates. Furthermore, for multi-category, ordered, or continuous covariates, existing approaches often dichotomize them using a priori cutoffs, commonly the median of the covariate. This may lead to information loss and reduced power in detecting MFC DIF. To address these limitations, we propose a method to identify both main effect DIF and interactive DIF. This method can automatically search for the optimal cutoffs for ordered or continuous covariates without pre-defined cutoffs. We introduce the rationale behind the proposed method and evaluate its performance through three Monte Carlo simulation studies. Results demonstrate that the proposed method effectively identifies various DIF forms in MFC tests, thereby increasing detection power. Finally, we provide an empirical application to illustrate the practical applicability of the proposed method.
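The abstract highlights an automatic search for optimal cutoffs on ordered or continuous covariates, in contrast to a fixed median split. The paper's actual estimation procedure is not reproduced on this page, so the following is only a minimal Python sketch of the general idea: scan candidate cutoffs of a hypothetical continuous covariate (here, age) and retain the cutoff that maximizes a simple between-group difference in preference rates for a single forced-choice comparison. The simulated data, the covariate name, and the summary statistic are all illustrative assumptions, not the authors' method.

```python
# Illustrative sketch only: this is NOT the paper's algorithm. It shows the
# general idea of scanning candidate cutoffs of a continuous covariate and
# keeping the one that maximizes a simple between-group DIF statistic for a
# single forced-choice comparison. All names and the statistic are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n = 500
age = rng.uniform(20, 60, size=n)          # hypothetical continuous covariate
# Simulated responses (1 = statement A preferred over B); respondents older
# than 45 get a higher preference probability to induce DIF at that point.
p = np.where(age > 45, 0.70, 0.50)
response = rng.binomial(1, p)

def dif_statistic(resp, group):
    """Absolute difference in preference rates between the two groups."""
    return abs(resp[group].mean() - resp[~group].mean())

# Scan candidate cutoffs over interior quantiles of the covariate instead of
# fixing the split at the median a priori.
candidates = np.quantile(age, np.linspace(0.1, 0.9, 81))
stats = np.array([dif_statistic(response, age > c) for c in candidates])

best = candidates[stats.argmax()]
print(f"Selected cutoff: {best:.1f}, DIF statistic: {stats.max():.3f}")
```

In the actual framework, the DIF criterion would presumably be model-based (for example, a likelihood-based statistic under a Thurstonian IRT model for MFC blocks), and inference would need to account for the data-driven choice of cutoff. The sketch only conveys why searching over cutoffs can recover group differences that a fixed median split would dilute.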
Source Journal

CiteScore: 23.20
Self-citation rate: 3.20%
Articles published: 17
Journal Introduction

Organizational Research Methods (ORM) was founded with the aim of introducing pertinent methodological advancements to researchers in organizational sciences. The objective of ORM is to promote the application of current and emerging methodologies to advance both theory and research practices. Articles are expected to be comprehensible to readers with a background consistent with the methodological and statistical training provided in contemporary organizational sciences doctoral programs. The text should be presented in a manner that facilitates accessibility. For instance, highly technical content should be placed in appendices, and authors are encouraged to include example data and computer code when relevant. Additionally, authors should explicitly outline how their contribution has the potential to advance organizational theory and research practice.
Latest Articles in This Journal

One Size Does Not Fit All: Unraveling Item Response Process Heterogeneity Using the Mixture Dominance-Unfolding Model (MixDUM)
Taking It Easy: Off-the-Shelf Versus Fine-Tuned Supervised Modeling of Performance Appraisal Text
Hello World! Building Computational Models to Represent Social and Organizational Theory
The Effects of the Training Sample Size, Ground Truth Reliability, and NLP Method on Language-Based Automatic Interview Scores' Psychometric Properties
Enhancing Causal Pursuits in Organizational Science: Targeting the Effect of Treatment on the Treated in Research on Vulnerable Populations