Title: Automatic multisensory integration follows subjective confidence rather than objective performance
Authors: Yi Gao, Kai Xue, Brian Odegaard, Dobromir Rahnev
Journal: bioRxiv: the preprint server for biology
Publication date: 2025-01-12
DOI: 10.1101/2023.06.07.544029 (https://doi.org/10.1101/2023.06.07.544029)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10274803/pdf/
Citations: 0
Abstract
It is well known that sensory information from one modality can automatically affect judgments in a different sensory modality. However, it remains unclear what determines the strength of the influence of an irrelevant sensory cue from one modality on a perceptual judgment in a different modality. Here we test whether the strength of the multisensory impact of an irrelevant sensory cue depends on participants' objective accuracy or subjective confidence for that cue. We created visual motion stimuli with low vs. high overall motion energy, where high-energy stimuli yielded higher confidence but lower accuracy in a visual-only task. We then tested the impact of the low- and high-energy visual stimuli on auditory motion perception. We found that the high-energy visual stimuli influenced the auditory motion judgments more strongly than the low-energy visual stimuli, consistent with their higher confidence but contrary to their lower accuracy. A computational model assuming common principles underlying confidence reports and multisensory integration captured these effects. Our findings show that automatic multisensory integration follows subjective confidence rather than objective performance, and suggest the existence of common computations across vastly different stages of perceptual decision making.
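The abstract's central claim can be illustrated with a minimal sketch. This is not the authors' computational model (whose details are not given here); it is a hypothetical confidence-weighted cue-combination toy in which the weight an irrelevant visual cue receives tracks subjective confidence rather than objective reliability, so a higher-confidence but less accurate cue pulls the auditory judgment more.

```python
# Illustrative sketch (assumption, not the paper's model): an auditory
# direction estimate is biased toward a visual motion cue with a weight
# proportional to subjective confidence in that cue, independent of the
# cue's objective accuracy.

def integrate(auditory_estimate: float, visual_cue: float,
              visual_confidence: float) -> float:
    """Confidence-weighted pull of an auditory judgment toward a visual cue.

    visual_confidence is a weight in [0, 1]; accuracy never enters.
    """
    w = visual_confidence
    return (1 - w) * auditory_estimate + w * visual_cue

# Hypothetical numbers mirroring the design: the high-energy stimulus is
# LESS accurate but elicits HIGHER confidence, so it biases more strongly.
low_energy_bias = integrate(auditory_estimate=0.0, visual_cue=1.0,
                            visual_confidence=0.3)   # weaker pull
high_energy_bias = integrate(auditory_estimate=0.0, visual_cue=1.0,
                             visual_confidence=0.7)  # stronger pull
assert high_energy_bias > low_energy_bias
```

Under these assumed numbers the high-energy cue shifts the auditory estimate further (0.7 vs. 0.3), matching the reported pattern: the influence follows confidence, not accuracy.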