Dynamic Weighting of Time-Varying Visual and Auditory Evidence During Multisensory Decision Making.

IF 1.8 | CAS Tier 4, Psychology | JCR Q3, Biophysics | Multisensory Research | Pub Date: 2022-02-21 | DOI: 10.31234/osf.io/knzbv
Rosanne Roosmarijn Maria Tuip, W.M. van der Ham, Jeannette A. M. Lorteije, Filip Van Opstal
Multisensory Research, vol. 36, pp. 31-56.
Citations: 0

Abstract

Dynamic Weighting of Time-Varying Visual and Auditory Evidence During Multisensory Decision Making.
Perceptual decision-making in a dynamic environment requires two integration processes: integration of sensory evidence from multiple modalities to form a coherent representation of the environment, and integration of evidence across time to accurately make a decision. Only recently have studies started to unravel how evidence from two modalities is accumulated across time to form a perceptual decision. One important question is whether information from individual senses contributes equally to multisensory decisions. We designed a new psychophysical task that measures how visual and auditory evidence is weighted across time. Participants were asked to discriminate between two visual gratings and/or two sounds presented to the right and left ear, based on contrast and loudness, respectively. We varied the evidence, i.e., the contrast of the gratings and the amplitude of the sounds, over time. Results showed a significant increase in performance accuracy on multisensory trials compared to unisensory trials, indicating that discriminating between two sources is improved when multisensory information is available. Furthermore, we found that early evidence contributed most to sensory decisions. Weighting of unisensory information during audiovisual decision-making dynamically changed over time. A first epoch was characterized by both visual and auditory weighting; during the second epoch vision dominated, and the third epoch finalized the weighting profile with auditory dominance. Our results suggest that during our task multisensory improvement is generated by a mechanism that requires cross-modal interactions but also dynamically evokes dominance switching.
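The kind of time-resolved weighting profile the abstract describes is commonly estimated with psychophysical reverse correlation: regress each trial's choice on the binned evidence fluctuations of each modality, so that each regression coefficient gives the weight of that modality at that moment. The sketch below illustrates the idea on simulated data; it is not the paper's actual analysis, and all parameters (trial counts, bin counts, "true" weight profiles, noise level) are illustrative assumptions.

```python
# Sketch of temporal-weight estimation via logistic reverse correlation.
# Simulated observer only; design values are assumptions, not the study's.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_bins = 5000, 6

# Signed evidence per trial and time bin for each modality
# (positive favours one response, negative the other).
vis = rng.normal(0.0, 1.0, (n_trials, n_bins))
aud = rng.normal(0.0, 1.0, (n_trials, n_bins))

# Assumed "true" observer weights: early evidence counts most,
# loosely mimicking the early-weighting profile reported above.
w_vis_true = np.array([1.0, 0.9, 0.8, 0.5, 0.3, 0.2])
w_aud_true = np.array([1.0, 0.8, 0.4, 0.3, 0.5, 0.4])

# Choice = sign of weighted evidence plus internal noise.
drive = vis @ w_vis_true + aud @ w_aud_true
choice = (drive + rng.normal(0.0, 1.5, n_trials) > 0).astype(float)

# Logistic regression of choice on all binned evidence, fitted by
# plain gradient ascent on the average log-likelihood.
X = np.hstack([vis, aud])            # shape (n_trials, 2 * n_bins)
w = np.zeros(2 * n_bins)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))  # predicted choice probability
    w += 0.1 * X.T @ (choice - p) / n_trials

# Recovered temporal weighting profiles per modality.
w_vis_hat, w_aud_hat = w[:n_bins], w[n_bins:]
```

With enough trials, the fitted coefficients recover the shape of the generating profiles (up to a scale factor set by the internal noise), so comparing `w_vis_hat` and `w_aud_hat` across bins reveals when each modality dominates.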
Source journal: Multisensory Research (Biophysics / Psychology)
CiteScore: 3.50
Self-citation rate: 12.50%
Articles published: 15
About the journal: Multisensory Research is an interdisciplinary archival journal covering all aspects of multisensory processing including the control of action, cognition and attention. Research using any approach to increase our understanding of multisensory perceptual, behavioural, neural and computational mechanisms is encouraged. Empirical, neurophysiological, psychophysical, brain imaging, clinical, developmental, mathematical and computational analyses are welcome. Research will also be considered covering multisensory applications such as sensory substitution, crossmodal methods for delivering sensory information or multisensory approaches to robotics and engineering. Short communications and technical notes that draw attention to new developments will be included, as will reviews and commentaries on current issues. Special issues dealing with specific topics will be announced from time to time. Multisensory Research is a continuation of Seeing and Perceiving, and of Spatial Vision.