{"title":"刺激驱动和表象驱动的跨模态注意扩散都会受到视听时间同步性的调节。","authors":"Song Zhao, Fangfang Ma, Jimei Xie, Yuxin Zhou, Chengzhi Feng, Wenfeng Feng","doi":"10.1111/psyp.14527","DOIUrl":null,"url":null,"abstract":"<p><p>Multisensory integration and attention can interact in a way that attention to the visual constituent of a multisensory object results in an attentional spreading to its ignored auditory constituent, which can be either stimulus-driven or representation-driven depending on whether the object's visual constituent receives extra representation-based selective attention. Previous research using simple unrelated audiovisual combinations has shown that the stimulus-driven attentional spreading is contingent on audiovisual temporal simultaneity. However, little is known about whether this temporal constraint applies also to the representation-driven attentional spreading, and whether it holds for the stimulus-driven process elicited by real-life multisensory objects. The current event-related potential study investigated these questions by systematically manipulating the visual-to-auditory stimulus onset asynchrony (SOA: 0/100/300 ms) in an object-selective visual recognition task wherein the representation-driven and stimulus-driven spreading processes, measured as two distinct auditory negative difference (Nd) components, could be isolated independently. Our results showed that both the representation-driven and stimulus-driven Nds decreased as the SOA increased. Interestingly, the representation-driven Nd was completely absent, whereas the stimulus-driven Nd was still robust, when the auditory constituents were delayed by 300 ms. These findings not only indicate that the role of audiovisual simultaneity in the representation-driven attentional spreading has been underestimated, but also suggest that learned associations between the unisensory constituents of real-life objects render the stimulus-driven attentional spreading more tolerant of audiovisual asynchrony.</p>","PeriodicalId":94182,"journal":{"name":"Psychophysiology","volume":" ","pages":"e14527"},"PeriodicalIF":0.0000,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The stimulus-driven and representation-driven cross-modal attentional spreading are both modulated by audiovisual temporal synchrony.\",\"authors\":\"Song Zhao, Fangfang Ma, Jimei Xie, Yuxin Zhou, Chengzhi Feng, Wenfeng Feng\",\"doi\":\"10.1111/psyp.14527\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Multisensory integration and attention can interact in a way that attention to the visual constituent of a multisensory object results in an attentional spreading to its ignored auditory constituent, which can be either stimulus-driven or representation-driven depending on whether the object's visual constituent receives extra representation-based selective attention. Previous research using simple unrelated audiovisual combinations has shown that the stimulus-driven attentional spreading is contingent on audiovisual temporal simultaneity. However, little is known about whether this temporal constraint applies also to the representation-driven attentional spreading, and whether it holds for the stimulus-driven process elicited by real-life multisensory objects. 
The current event-related potential study investigated these questions by systematically manipulating the visual-to-auditory stimulus onset asynchrony (SOA: 0/100/300 ms) in an object-selective visual recognition task wherein the representation-driven and stimulus-driven spreading processes, measured as two distinct auditory negative difference (Nd) components, could be isolated independently. Our results showed that both the representation-driven and stimulus-driven Nds decreased as the SOA increased. Interestingly, the representation-driven Nd was completely absent, whereas the stimulus-driven Nd was still robust, when the auditory constituents were delayed by 300 ms. These findings not only indicate that the role of audiovisual simultaneity in the representation-driven attentional spreading has been underestimated, but also suggest that learned associations between the unisensory constituents of real-life objects render the stimulus-driven attentional spreading more tolerant of audiovisual asynchrony.</p>\",\"PeriodicalId\":94182,\"journal\":{\"name\":\"Psychophysiology\",\"volume\":\" \",\"pages\":\"e14527\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Psychophysiology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1111/psyp.14527\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/1/19 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Psychophysiology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1111/psyp.14527","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/19 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
The stimulus-driven and representation-driven cross-modal attentional spreading are both modulated by audiovisual temporal synchrony.
Multisensory integration and attention can interact such that attending to the visual constituent of a multisensory object results in attentional spreading to its ignored auditory constituent. This spreading can be either stimulus-driven or representation-driven, depending on whether the object's visual constituent receives additional representation-based selective attention. Previous research using simple, unrelated audiovisual combinations has shown that stimulus-driven attentional spreading is contingent on audiovisual temporal simultaneity. However, little is known about whether this temporal constraint also applies to representation-driven attentional spreading, and whether it holds for the stimulus-driven process elicited by real-life multisensory objects. The current event-related potential study investigated these questions by systematically manipulating the visual-to-auditory stimulus onset asynchrony (SOA: 0/100/300 ms) in an object-selective visual recognition task wherein the representation-driven and stimulus-driven spreading processes, measured as two distinct auditory negative difference (Nd) components, could be isolated independently. Our results showed that both the representation-driven and stimulus-driven Nds decreased as the SOA increased. Interestingly, when the auditory constituents were delayed by 300 ms, the representation-driven Nd was completely absent, whereas the stimulus-driven Nd remained robust. These findings not only indicate that the role of audiovisual simultaneity in representation-driven attentional spreading has been underestimated, but also suggest that learned associations between the unisensory constituents of real-life objects render stimulus-driven attentional spreading more tolerant of audiovisual asynchrony.
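To make the Nd measurement concrete, below is a minimal, illustrative Python sketch of how an auditory Nd component can be quantified as an attended-minus-ignored ERP difference wave for each SOA condition. This is not the authors' analysis pipeline: the array names, sampling rate, epoch length, and the 200-400 ms measurement window are assumptions made purely for illustration, and the waveforms are random placeholders rather than real data.

```python
import numpy as np

SFREQ = 500                   # assumed sampling rate (Hz)
TMIN = -0.1                   # assumed epoch start relative to sound onset (s)
N_SAMPLES = int(0.7 * SFREQ)  # assumed epoch length of 700 ms

def mean_amplitude(erp, window, sfreq=SFREQ, tmin=TMIN):
    """Mean amplitude of an ERP waveform within a latency window given in seconds."""
    start = int(round((window[0] - tmin) * sfreq))
    stop = int(round((window[1] - tmin) * sfreq))
    return erp[start:stop].mean()

# Hypothetical grand-average ERPs (microvolts) time-locked to sound onset,
# for sounds belonging to attended vs. ignored visual objects, per SOA.
rng = np.random.default_rng(0)
erps = {
    soa: {
        "attended": rng.normal(0.0, 1.0, N_SAMPLES),
        "ignored": rng.normal(0.0, 1.0, N_SAMPLES),
    }
    for soa in (0, 100, 300)  # visual-to-auditory SOA in ms
}

# The Nd is the attended-minus-ignored difference wave; a more negative mean
# amplitude in the measurement window reflects stronger attentional spreading.
for soa, cond in erps.items():
    nd_wave = cond["attended"] - cond["ignored"]
    nd_amp = mean_amplitude(nd_wave, window=(0.2, 0.4))  # assumed 200-400 ms window
    print(f"SOA {soa:>3} ms: Nd mean amplitude = {nd_amp:+.2f} µV")
```

Under this kind of quantification, the study's pattern of results would appear as progressively smaller (less negative) mean Nd amplitudes at longer SOAs, with the representation-driven difference wave vanishing at the 300 ms delay while the stimulus-driven one remains reliable.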