Object-based encoding constrains storage in visual working memory.

Impact Factor: 3.7 | CAS Region 1 (Psychology) | JCR Q1 (Psychology, Experimental) | Journal of Experimental Psychology: General | Pub Date: 2024-01-01 | Epub Date: 2023-09-11 | DOI: 10.1037/xge0001479
William X Q Ngiam, Krystian B Loetscher, Edward Awh
Citations: 0

Abstract

The fundamental unit of visual working memory (WM) has been debated for decades. WM could be object-based, such that capacity is set by the number of individuated objects, or feature-based, such that capacity is determined by the total number of feature values stored. The present work examined whether object- or feature-based models would best explain how multifeature objects (i.e., color/orientation or color/shape) are encoded into visual WM. If maximum capacity is limited by the number of individuated objects, then above-chance performance should be restricted to the same number of items as in a single-feature condition. By contrast, if capacity is determined by independent storage resources for distinct features, without respect to the objects that contain those features, then successful storage of feature values could be distributed across a larger number of objects than when only a single feature is relevant. We conducted four experiments using a whole-report task in which subjects reported both features from every item in a six-item array. The crucial finding was that above-chance recall, for both single- and multifeatured objects, was restricted to the first three or four responses, while the later responses were best modeled as guesses. Thus, whole-report with multifeature objects reveals a distribution of recalled features that indicates an object-based limit on WM capacity. (PsycInfo Database Record (c) 2024 APA, all rights reserved).
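The contrasting predictions the abstract describes can be sketched in a small simulation (illustrative only; the capacity parameter and function names are assumptions, not the authors' analysis code). An object-based limit concentrates stored information in exactly K objects, whereas independent feature stores spread information over more objects on average:

```python
# Illustrative sketch of object-based vs. feature-based storage predictions
# for a six-item, two-feature (e.g., color/orientation) whole-report array.
import random

N_ITEMS = 6  # array size used in the experiments
K = 3        # assumed capacity limit (objects, or values per feature dimension)

def object_based_trial(rng):
    """Store both features of K randomly chosen objects."""
    stored = set(rng.sample(range(N_ITEMS), K))
    # number of features stored per item: 2 for stored objects, 0 otherwise
    return [2 if i in stored else 0 for i in range(N_ITEMS)]

def feature_based_trial(rng):
    """Store K values of each feature independently, regardless of object."""
    colors = set(rng.sample(range(N_ITEMS), K))
    orients = set(rng.sample(range(N_ITEMS), K))
    return [(i in colors) + (i in orients) for i in range(N_ITEMS)]

def mean_items_with_info(trial_fn, n_trials=5000, seed=0):
    """Average number of objects holding at least one stored feature."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        total += sum(1 for f in trial_fn(rng) if f > 0)
    return total / n_trials

obj = mean_items_with_info(object_based_trial)    # always exactly K = 3
feat = mean_items_with_info(feature_based_trial)  # ~4.5 on average
```

Under the feature-based model the expected number of objects carrying some information is 6 × (1 − (3/6)²) = 4.5, more than the three objects predicted by the object-based model; the reported whole-report results pattern with the latter.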

Source Journal: Journal of Experimental Psychology: General
CiteScore: 6.20
Self-citation rate: 4.90%
Annual publications: 300
About the journal: The Journal of Experimental Psychology: General publishes articles describing empirical work that bridges the traditional interests of two or more communities of psychology. The work may touch on issues dealt with in JEP: Learning, Memory, and Cognition, JEP: Human Perception and Performance, JEP: Animal Behavior Processes, or JEP: Applied, but may also concern issues in other subdisciplines of psychology, including social processes, developmental processes, psychopathology, neuroscience, or computational modeling. Articles in JEP: General may be longer than the usual journal publication if necessary, but shorter articles that bridge subdisciplines will also be considered.
Latest articles in this journal:
Bypassing versus correcting misinformation: Efficacy and fundamental processes.
Risky hybrid foraging: The impact of risk, reward value, and prevalence on foraging behavior in hybrid visual search.
Shortcuts to insincerity: Texting abbreviations seem insincere and not worth answering.
Confidence regulates feedback processing during human probabilistic learning.
Does affective processing require awareness? On the use of the Perceptual Awareness Scale in response priming research.