Neural Dynamics of Audiovisual Integration for Speech and Non-Speech Stimuli: A Psychophysical Study

Nicholas A. Altieri
The Open Neuroscience Journal, pp. 5-18, published 2013-10-18
DOI: 10.2174/1874082001307010005
This study investigated the extent to which audiovisual speech integration is special by comparing behavioral and neural measures using both speech and non-speech stimuli. An audiovisual recognition experiment presenting listeners with auditory, visual, and audiovisual stimuli was implemented. The auditory component consisted of sine-wave speech, and the visual component consisted of point-light displays, in which dots highlight a talker's points of articulation. In the first phase, listeners engaged in a discrimination task while unaware of the linguistic nature of the auditory and visual stimuli. In the second phase, they were informed that the auditory and visual stimuli were spoken utterances of /be/ ("bay") and /de/ ("day"), and they engaged in the same task. The neural dynamics of audiovisual integration were investigated using EEG, including mean Global Field Power and current density reconstruction (CDR). As predicted, support for divergent regions of multisensory integration between the speech and non-speech stimuli was obtained, namely greater posterior parietal activation in the non-speech condition. Conversely, reaction-time measures indicated qualitatively similar multisensory integration across experimental conditions.
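The abstract's EEG measure, Global Field Power, is conventionally defined as the standard deviation of the voltage across all electrodes at each time point; mean GFP then averages that trace over a time window. A minimal NumPy sketch of that convention (the function names and the average-referencing assumption are illustrative, not taken from the paper's analysis pipeline):

```python
import numpy as np

def global_field_power(eeg):
    """Global Field Power at each sample: the population standard
    deviation of voltage across electrodes.

    eeg: array of shape (n_electrodes, n_samples), average-referenced.
    Returns an array of shape (n_samples,).
    """
    # np.std with the default ddof=0 matches the GFP definition
    # sqrt((1/N) * sum_i (v_i(t) - v_mean(t))^2).
    return eeg.std(axis=0)

def mean_gfp(eeg, start, stop):
    """Mean GFP over the sample window [start, stop)."""
    return global_field_power(eeg)[start:stop].mean()
```

With two electrodes carrying opposite-polarity signals, e.g. `np.array([[1.0, -1.0], [-1.0, 1.0]])`, the GFP trace is constant at 1.0, reflecting the spread across the montage rather than the polarity of any single channel.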