Serena Giurgola, Emanuele Lo Gerfo, Alessandro Farnè, Alice C. Roy, Nadia Bolognini
{"title":"初级运动皮层的多感官整合与运动共振","authors":"Serena Giurgola , Emanuele Lo Gerfo , Alessandro Farnè , Alice C. Roy , Nadia Bolognini","doi":"10.1016/j.cortex.2024.07.015","DOIUrl":null,"url":null,"abstract":"<div><p>Humans are endowed with a motor system that resonates to speech sounds, but whether concurrent visual information from lip movements can improve speech perception at a motor level through multisensory integration mechanisms remains unknown. Therefore, the aim of the study was to explore behavioral and neurophysiological correlates of multisensory influences on motor resonance in speech perception. Motor-evoked potentials (MEPs), by single pulse transcranial magnetic stimulation (TMS) applied over the left lip muscle (orbicularis oris) representation in the primary motor cortex, were recorded in healthy participants during the presentation of syllables in unimodal (visual or auditory) or multisensory (audio-visual) congruent or incongruent conditions. At the behavioral level, subjects showed better syllable identification in the congruent audio-visual condition as compared to the unimodal conditions, hence showing a multisensory enhancement effect. Accordingly, at the neurophysiological level, increased MEPs amplitudes were found in the congruent audio-visual condition, as compared to the unimodal ones. Incongruent audio-visual syllables resulting in illusory percepts did not increase corticospinal excitability, which in fact was comparable to that induced by the real perception of the same syllable. In conclusion, seeing and hearing congruent bilabial syllables increases the excitability of the lip representation in the primary motor cortex, hence documenting that multisensory integration can facilitate speech processing by influencing motor resonance. These findings highlight the modulation role of multisensory processing showing that it can boost speech perception and that multisensory interactions occur not only within higher-order regions, but also within primary motor areas, as shown by corticospinal excitability changes.</p></div>","PeriodicalId":10758,"journal":{"name":"Cortex","volume":"179 ","pages":"Pages 235-246"},"PeriodicalIF":3.2000,"publicationDate":"2024-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multisensory integration and motor resonance in the primary motor cortex\",\"authors\":\"Serena Giurgola , Emanuele Lo Gerfo , Alessandro Farnè , Alice C. Roy , Nadia Bolognini\",\"doi\":\"10.1016/j.cortex.2024.07.015\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Humans are endowed with a motor system that resonates to speech sounds, but whether concurrent visual information from lip movements can improve speech perception at a motor level through multisensory integration mechanisms remains unknown. Therefore, the aim of the study was to explore behavioral and neurophysiological correlates of multisensory influences on motor resonance in speech perception. Motor-evoked potentials (MEPs), by single pulse transcranial magnetic stimulation (TMS) applied over the left lip muscle (orbicularis oris) representation in the primary motor cortex, were recorded in healthy participants during the presentation of syllables in unimodal (visual or auditory) or multisensory (audio-visual) congruent or incongruent conditions. 
At the behavioral level, subjects showed better syllable identification in the congruent audio-visual condition as compared to the unimodal conditions, hence showing a multisensory enhancement effect. Accordingly, at the neurophysiological level, increased MEPs amplitudes were found in the congruent audio-visual condition, as compared to the unimodal ones. Incongruent audio-visual syllables resulting in illusory percepts did not increase corticospinal excitability, which in fact was comparable to that induced by the real perception of the same syllable. In conclusion, seeing and hearing congruent bilabial syllables increases the excitability of the lip representation in the primary motor cortex, hence documenting that multisensory integration can facilitate speech processing by influencing motor resonance. These findings highlight the modulation role of multisensory processing showing that it can boost speech perception and that multisensory interactions occur not only within higher-order regions, but also within primary motor areas, as shown by corticospinal excitability changes.</p></div>\",\"PeriodicalId\":10758,\"journal\":{\"name\":\"Cortex\",\"volume\":\"179 \",\"pages\":\"Pages 235-246\"},\"PeriodicalIF\":3.2000,\"publicationDate\":\"2024-08-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Cortex\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S001094522400220X\",\"RegionNum\":2,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"BEHAVIORAL SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cortex","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S001094522400220X","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"BEHAVIORAL SCIENCES","Score":null,"Total":0}
Multisensory integration and motor resonance in the primary motor cortex
Humans are endowed with a motor system that resonates to speech sounds, but whether concurrent visual information from lip movements can improve speech perception at the motor level through multisensory integration mechanisms remains unknown. The aim of this study was therefore to explore the behavioral and neurophysiological correlates of multisensory influences on motor resonance in speech perception. Motor-evoked potentials (MEPs), elicited by single-pulse transcranial magnetic stimulation (TMS) applied over the left lip muscle (orbicularis oris) representation in the primary motor cortex, were recorded in healthy participants during the presentation of syllables in unimodal (visual or auditory) or multisensory (audio-visual), congruent or incongruent, conditions. At the behavioral level, participants identified syllables better in the congruent audio-visual condition than in the unimodal conditions, showing a multisensory enhancement effect. Accordingly, at the neurophysiological level, MEP amplitudes were larger in the congruent audio-visual condition than in the unimodal ones. Incongruent audio-visual syllables resulting in illusory percepts did not increase corticospinal excitability, which was in fact comparable to that induced by the real perception of the same syllable. In conclusion, seeing and hearing congruent bilabial syllables increases the excitability of the lip representation in the primary motor cortex, documenting that multisensory integration can facilitate speech processing by influencing motor resonance. These findings highlight the modulatory role of multisensory processing, showing that it can boost speech perception and that multisensory interactions occur not only within higher-order regions but also within primary motor areas, as revealed by changes in corticospinal excitability.
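A worked illustration (not the authors' analysis pipeline, which the abstract does not detail): multisensory enhancement of the kind reported here is often quantified as the percentage gain of the multisensory response over the strongest unimodal response. The Python sketch below assumes hypothetical per-participant MEP amplitudes for the congruent audio-visual, auditory-only, and visual-only conditions; all values are made up for illustration.

    import numpy as np

    def multisensory_enhancement_index(av, a, v):
        """Percentage gain of the audio-visual response over the best unimodal one.

        av, a, v: arrays of MEP amplitudes (one value per participant) for the
        congruent audio-visual, auditory-only, and visual-only conditions.
        """
        best_unimodal = np.maximum(a, v)  # strongest unimodal response per participant
        return 100.0 * (av - best_unimodal) / best_unimodal

    # Hypothetical amplitudes in mV -- illustrative placeholders, not data from the study.
    av = np.array([1.20, 0.95, 1.40])
    a = np.array([1.00, 0.90, 1.10])
    v = np.array([0.85, 0.80, 1.05])
    print(multisensory_enhancement_index(av, a, v))  # positive values indicate enhancement

Positive values of such an index would correspond to the MEP facilitation the abstract describes for the congruent audio-visual condition.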
Journal introduction:
CORTEX is an international journal devoted to the study of cognition and of the relationship between the nervous system and mental processes, particularly as these are reflected in the behaviour of patients with acquired brain lesions, normal volunteers, children with typical and atypical development, and in the activation of brain regions and systems as recorded by functional neuroimaging techniques. It was founded in 1964 by Ennio De Renzi.