Sounds prevent selective monitoring of high spatial frequency channels in vision
Alexis Pérez-Bellido, Joan López-Moliner, S. Soto-Faraco
Seeing and Perceiving, 25(1), 40–40 (2012)
DOI: 10.1163/187847612X646622
Cited by: 0
Abstract
Prior knowledge about the spatial frequency (SF) of upcoming visual targets (Gabor patches) speeds up average reaction times and decreases their standard deviation. This has often been regarded as evidence for multichannel processing of SF in vision. Multisensory research, on the other hand, has often reported sensory interactions between auditory and visual signals. These interactions enhance visual processing, leading to lower sensory thresholds and/or more precise visual estimates. However, little is known about how multisensory interactions may affect the uncertainty regarding visual SF. We conducted a reaction time study in which we manipulated the uncertainty about the SF of visual targets (SF was blocked or interleaved across trials), and compared visual-only versus audio–visual presentations. Surprisingly, the analysis of the reaction times and their standard deviation revealed an impairment of the selective monitoring of the SF channel in the presence of a concurrent sound. Moreover, this impairment was especially pronounced when the relevant channels were high SFs at high visual contrasts. We propose that an accessory sound automatically favours visual processing of low SFs through the magnocellular channels, thereby detracting from the potential benefits of tuning into high SF psychophysical channels.