Multiple mechanisms of visual prediction as revealed by the timecourse of scene-object facilitation.

Psychophysiology · Pub Date: 2024-05-01 · Epub Date: 2024-01-05 · DOI: 10.1111/psyp.14503
Cybelle M Smith, Kara D Federmeier
Abstract

Not only semantic associations, but also recently learned arbitrary associations have the potential to facilitate visual processing in everyday life; for example, knowledge of a (moveable) object's location at a specific time may facilitate visual processing of that object. In our prior work, we showed that previewing a scene can facilitate processing of recently associated objects at the level of visual analysis (Smith & Federmeier, Journal of Cognitive Neuroscience, 32(5), 783-803, 2020). In the current study, we assess how rapidly this facilitation unfolds by manipulating scene preview duration. We then compare our results to studies using well-learned object-scene associations in a first-pass assessment of whether systems consolidation might speed up high-level visual prediction. In two ERP experiments (N = 60), we had participants study categorically organized novel object-scene pairs in an explicit paired-associate learning task. At test, we varied contextual pre-exposure duration, both between subjects (200 vs. 2500 ms) and within subjects (0-2500 ms). We examined the N300, an event-related potential component linked to high-level visual processing of objects and scenes, and found that N300 effects of scene congruity increase with longer scene previews, up to approximately 1-2 s. Similar results were obtained for response times and in a separate component-neutral ERP analysis of visual template matching. Our findings contrast with prior evidence that scenes can rapidly facilitate visual processing of commonly associated objects. This raises the possibility that systems consolidation might mediate different kinds of predictive processing with different temporal profiles.
