Evaluating the complexity and falsifiability of psychological models.

Psychological Review · Impact Factor 5.1 · CAS Tier 1 (Psychology) · JCR Q1 (Psychology) · Pub Date: 2023-07-01 · DOI: 10.1037/rev0000421
Manuel Villarreal, Alexander Etz, Michael D Lee
{"title":"Evaluating the complexity and falsifiability of psychological models.","authors":"Manuel Villarreal,&nbsp;Alexander Etz,&nbsp;Michael D Lee","doi":"10.1037/rev0000421","DOIUrl":null,"url":null,"abstract":"<p><p>Understanding model complexity is important for developing useful psychological models. One way to think about model complexity is in terms of the predictions a model makes and the ability of empirical evidence to falsify those predictions. We argue that existing measures of falsifiability have important limitations and develop a new measure. KL-delta uses Kullback-Leibler divergence to compare the prior predictive distributions of models to the data prior that formalizes knowledge about the plausibility of different experimental outcomes. Using introductory conceptual examples and applications with existing models and experiments, we show that KL-delta challenges widely held scientific intuitions about model complexity and falsifiability. In a psychophysics application, we show that hierarchical models with more parameters are often more falsifiable than the original nonhierarchical model. This counters the intuition that adding parameters always makes a model more complex. In a decision-making application, we show that a choice model incorporating response determinism can be harder to falsify than its special case of probability matching. This counters the intuition that if one model is a special case of another, the special case must be less complex. In a memory recall application, we show that using informative data priors based on the serial position curve allows KL-delta to distinguish models that otherwise would be indistinguishable. This shows the value in model evaluation of extending the notion of possible falsifiability, in which all data are considered equally likely, to the more general notion of plausible falsifiability, in which some data are more likely than others. (PsycInfo Database Record (c) 2023 APA, all rights reserved).</p>","PeriodicalId":21016,"journal":{"name":"Psychological review","volume":"130 4","pages":"853-872"},"PeriodicalIF":5.1000,"publicationDate":"2023-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Psychological review","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.1037/rev0000421","RegionNum":1,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY","Score":null,"Total":0}
Citations: 0

Abstract

Understanding model complexity is important for developing useful psychological models. One way to think about model complexity is in terms of the predictions a model makes and the ability of empirical evidence to falsify those predictions. We argue that existing measures of falsifiability have important limitations and develop a new measure. KL-delta uses Kullback-Leibler divergence to compare the prior predictive distributions of models to the data prior that formalizes knowledge about the plausibility of different experimental outcomes. Using introductory conceptual examples and applications with existing models and experiments, we show that KL-delta challenges widely held scientific intuitions about model complexity and falsifiability. In a psychophysics application, we show that hierarchical models with more parameters are often more falsifiable than the original nonhierarchical model. This counters the intuition that adding parameters always makes a model more complex. In a decision-making application, we show that a choice model incorporating response determinism can be harder to falsify than its special case of probability matching. This counters the intuition that if one model is a special case of another, the special case must be less complex. In a memory recall application, we show that using informative data priors based on the serial position curve allows KL-delta to distinguish models that otherwise would be indistinguishable. This shows the value in model evaluation of extending the notion of possible falsifiability, in which all data are considered equally likely, to the more general notion of plausible falsifiability, in which some data are more likely than others. (PsycInfo Database Record (c) 2023 APA, all rights reserved).
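The abstract describes KL-delta as a comparison, via Kullback-Leibler divergence, between a model's prior predictive distribution and a data prior encoding which experimental outcomes are plausible. The sketch below is only an illustration of that general idea, not the authors' implementation: it assumes a discrete outcome space, takes KL(data prior || prior predictive) as the direction of comparison, and uses made-up toy priors as loose stand-ins for "probability matching" and "response determinism" style choice models.

```python
import numpy as np
from scipy import stats

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def prior_predictive(likelihood_fn, prior_samples, outcomes):
    """Monte Carlo estimate of a model's prior predictive distribution
    over a discrete set of outcomes."""
    probs = np.zeros(len(outcomes))
    for theta in prior_samples:
        probs += np.array([likelihood_fn(y, theta) for y in outcomes])
    return probs / probs.sum()

# Toy example (not the paper's models): k successes out of n = 10 trials.
# Model A: success rate concentrated near 0.7 (probability-matching-like).
# Model B: U-shaped prior pushing rates toward 0 or 1 (determinism-like).
n = 10
outcomes = np.arange(n + 1)
rng = np.random.default_rng(0)
samples_a = rng.beta(70, 30, size=5000)
samples_b = rng.beta(0.5, 0.5, size=5000)
lik = lambda k, p: stats.binom.pmf(k, n, p)

pp_a = prior_predictive(lik, samples_a, outcomes)
pp_b = prior_predictive(lik, samples_b, outcomes)

# Hypothetical data prior: outcomes judged empirically plausible
# (here, mildly favoring above-chance performance).
data_prior = stats.binom.pmf(outcomes, n, 0.6)

print("KL(data prior || model A prior predictive):", kl_divergence(data_prior, pp_a))
print("KL(data prior || model B prior predictive):", kl_divergence(data_prior, pp_b))
```

A model whose prior predictive distribution diverges sharply from the data prior concentrates its predictions on implausible outcomes and is, in this informal sense, easier to falsify; the paper's point is that such comparisons can reverse common intuitions about which of two models is "more complex."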

Source journal: Psychological Review (Medicine – Psychology)
CiteScore: 9.70
Self-citation rate: 5.60%
Articles per year: 97
Journal description: Psychological Review publishes articles that make important theoretical contributions to any area of scientific psychology, including systematic evaluation of alternative theories.
Latest articles in this journal:
How does depressive cognition develop? A state-dependent network model of predictive processing.
Bouncing back from life's perturbations: Formalizing psychological resilience from a complex systems perspective.
The meaning of attention control.
Counterfactuals and the logic of causal selection.
The relation between learning and stimulus-response binding.