Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration

Jordi Navarra, Argiro Vatakis, Massimiliano Zampini, Salvador Soto-Faraco, William Humphreys, Charles Spence
{"title":"Exposure to asynchronous audiovisual speech extends the temporal window for audiovisual integration","authors":"Jordi Navarra ,&nbsp;Argiro Vatakis ,&nbsp;Massimiliano Zampini ,&nbsp;Salvador Soto-Faraco ,&nbsp;William Humphreys ,&nbsp;Charles Spence","doi":"10.1016/j.cogbrainres.2005.07.009","DOIUrl":null,"url":null,"abstract":"<div><p>We examined whether monitoring asynchronous audiovisual speech induces a general temporal recalibration of auditory and visual sensory processing. Participants monitored a videotape featuring a speaker pronouncing a list of words (Experiments 1 and 3) or a hand playing a musical pattern on a piano (Experiment 2). The auditory and visual channels were either presented in synchrony, or else asynchronously (with the visual signal leading the auditory signal by 300 ms; Experiments 1 and 2). While performing the monitoring task, participants were asked to judge the temporal order of pairs of auditory (white noise bursts) and visual stimuli (flashes) that were presented at varying stimulus onset asynchronies (SOAs) during the session. The results showed that, while monitoring desynchronized speech or music, participants required a longer interval between the auditory and visual stimuli in order to perceive their temporal order correctly, suggesting a widening of the temporal window for audiovisual integration. The fact that no such recalibration occurred when we used a longer asynchrony (1000 ms) that exceeded the temporal window for audiovisual integration (Experiment 3) supports this conclusion.</p></div>","PeriodicalId":100287,"journal":{"name":"Cognitive Brain Research","volume":"25 2","pages":"Pages 499-507"},"PeriodicalIF":0.0000,"publicationDate":"2005-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1016/j.cogbrainres.2005.07.009","citationCount":"182","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Brain Research","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0926641005002193","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 182

Abstract

We examined whether monitoring asynchronous audiovisual speech induces a general temporal recalibration of auditory and visual sensory processing. Participants monitored a videotape featuring a speaker pronouncing a list of words (Experiments 1 and 3) or a hand playing a musical pattern on a piano (Experiment 2). The auditory and visual channels were either presented in synchrony, or else asynchronously (with the visual signal leading the auditory signal by 300 ms; Experiments 1 and 2). While performing the monitoring task, participants were asked to judge the temporal order of pairs of auditory (white noise bursts) and visual stimuli (flashes) that were presented at varying stimulus onset asynchronies (SOAs) during the session. The results showed that, while monitoring desynchronized speech or music, participants required a longer interval between the auditory and visual stimuli in order to perceive their temporal order correctly, suggesting a widening of the temporal window for audiovisual integration. The fact that no such recalibration occurred when we used a longer asynchrony (1000 ms) that exceeded the temporal window for audiovisual integration (Experiment 3) supports this conclusion.
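The "temporal window" in studies of this kind is typically quantified as a just noticeable difference (JND) derived from a psychometric function fitted to the temporal order judgments across SOAs. The sketch below illustrates one common way such a fit might be done (a cumulative Gaussian); the SOA values, response proportions, and parameter names are illustrative assumptions, not data or methods taken from the paper.

```python
# Hypothetical sketch: estimating the point of subjective simultaneity (PSS)
# and the JND for audiovisual temporal order judgments by fitting a cumulative
# Gaussian to the proportion of "vision first" responses at each SOA.
# All numbers below are illustrative, not data from the study.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Illustrative SOAs (ms; negative = auditory stimulus presented first) and the
# proportion of "vision first" responses observed at each SOA.
soas = np.array([-300, -200, -100, -50, 0, 50, 100, 200, 300], dtype=float)
p_vision_first = np.array([0.02, 0.05, 0.15, 0.30, 0.50, 0.70, 0.85, 0.95, 0.98])

def psychometric(soa, pss, sigma):
    """Cumulative Gaussian: pss = point of subjective simultaneity (ms),
    sigma = spread of the fitted function (ms)."""
    return norm.cdf(soa, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soas, p_vision_first, p0=[0.0, 100.0])

# JND: half the SOA difference between the 25% and 75% points of the fit,
# which for a cumulative Gaussian equals ~0.6745 * sigma. A wider temporal
# window for integration would show up as a larger JND.
jnd = 0.6745 * sigma
print(f"PSS = {pss:.1f} ms, JND = {jnd:.1f} ms")
```

Under this kind of analysis, the recalibration effect reported in the abstract would appear as an increase in the JND (a shallower psychometric function) after monitoring desynchronized speech or music, relative to the synchronized condition.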
