Evaluating the performance of existing and novel equivalence tests for fit indices in structural equation modelling

British Journal of Mathematical & Statistical Psychology · Impact Factor 1.5 · JCR Q3 (Mathematics, Interdisciplinary Applications) · CAS Region 3 (Psychology) · Pub Date: 2023-07-13 · DOI: 10.1111/bmsp.12317
Nataly Beribisky, Robert A. Cribbie
Citations: 0

Abstract

It has been suggested that equivalence testing (otherwise known as negligible effect testing) should be used to evaluate model fit within structural equation modelling (SEM). In this study, we propose novel variations of equivalence tests based on the popular root mean squared error of approximation and comparative fit index fit indices. Using Monte Carlo simulations, we compare the performance of these novel tests to other existing equivalence testing-based fit indices in SEM, as well as to other methods commonly used to evaluate model fit. Results indicate that equivalence tests in SEM have good Type I error control and display considerable power for detecting well-fitting models in medium to large sample sizes. At small sample sizes, relative to traditional fit indices, equivalence tests limit the chance of supporting a poorly fitting model. We also present an illustrative example to demonstrate how equivalence tests may be incorporated in model fit reporting. Equivalence tests in SEM also have unique interpretational advantages compared to other methods of model fit evaluation. We recommend that equivalence tests be utilized in conjunction with descriptive fit indices to provide more evidence when evaluating model fit.
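The RMSEA-based equivalence test the abstract describes can be sketched as follows. This is a minimal illustration assuming the standard noncentral chi-square formulation of a close-fit (equivalence) test, in which H0: RMSEA ≥ ε0 is rejected when the model chi-square falls in the lower tail of the noncentral distribution implied by the bound ε0. It is not necessarily the exact variant proposed by the authors, and the function name and defaults are illustrative.

```python
from scipy.stats import ncx2


def rmsea_equivalence_test(T, df, n, eps0=0.05, alpha=0.05):
    """One-sided equivalence (close-fit) test based on RMSEA.

    H0: RMSEA >= eps0 (unacceptable fit)  vs  H1: RMSEA < eps0 (acceptable fit).
    T: model chi-square statistic; df: model degrees of freedom; n: sample size.
    Returns the p-value and whether H0 is rejected at level alpha.
    """
    # Noncentrality parameter implied by the equivalence bound eps0
    lam0 = (n - 1) * df * eps0 ** 2
    # p-value: probability of a chi-square this small or smaller at the H0 boundary
    p = ncx2.cdf(T, df, lam0)
    return p, p < alpha
```

A small p-value here is evidence *for* acceptable fit, which is the interpretational reversal, relative to the traditional exact-fit chi-square test, that motivates equivalence testing in SEM.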

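The Monte Carlo comparison described in the abstract can be sketched in simplified form. Rather than generating raw data and fitting SEMs, the sketch below exploits the standard approximation that the model chi-square statistic follows a noncentral chi-square distribution with noncentrality determined by the population RMSEA. This is an idealized shortcut, not the authors' full simulation design; all names and parameter values are illustrative.

```python
import numpy as np
from scipy.stats import ncx2


def rmsea_equivalence_p(T, df, n, eps0=0.05):
    # p-value of the close-fit equivalence test at bound eps0
    lam0 = (n - 1) * df * eps0 ** 2
    return ncx2.cdf(T, df, lam0)


def rejection_rate(df, n, true_rmsea, eps0=0.05, alpha=0.05, reps=20000, seed=1):
    """Estimate the test's rejection rate for a given population RMSEA.

    At true_rmsea == eps0 this approximates the Type I error rate;
    at true_rmsea < eps0 it approximates power.
    """
    rng = np.random.default_rng(seed)
    # Idealized draw: chi-square statistics implied by the population RMSEA
    lam_true = (n - 1) * df * true_rmsea ** 2
    T = ncx2.rvs(df, lam_true, size=reps, random_state=rng)
    return float(np.mean(rmsea_equivalence_p(T, df, n, eps0) < alpha))
```

Under this setup the rejection rate at the boundary (true RMSEA equal to the bound) should sit near the nominal alpha, and it should climb toward 1 for well-fitting models as the sample size grows, mirroring the Type I error and power patterns reported in the abstract.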

Source journal: British Journal of Mathematical & Statistical Psychology
CiteScore: 5.00
Self-citation rate: 3.80%
Articles per year: 34
Review time: >12 weeks
About the journal: The British Journal of Mathematical and Statistical Psychology publishes articles relating to areas of psychology which have a greater mathematical or statistical aspect of their argument than is usually acceptable to other journals, including:
• mathematical psychology
• statistics
• psychometrics
• decision making
• psychophysics
• classification
• relevant areas of mathematics, computing and computer software
These include articles that address substantive psychological issues or that develop and extend techniques useful to psychologists. New models for psychological processes, new approaches to existing data, critiques of existing models and improved algorithms for estimating the parameters of a model are examples of articles which may be favoured.
Latest articles in this journal:
• Investigating heterogeneity in IRTree models for multiple response processes with score-based partitioning.
• A convexity-constrained parameterization of the random effects generalized partial credit model.
• Handling missing data in variational autoencoder based item response theory.
• Maximal point-polyserial correlation for non-normal random distributions.
• Perturbation graphs, invariant causal prediction and causal relations in psychology.