András Bálint, Wilhelm Wimmer, Marco Caversaccio, Christian Rummel, Stefan Weder
{"title":"Brain activation patterns in normal hearing adults: An fNIRS Study using an adapted clinical speech comprehension task.","authors":"András Bálint, Wilhelm Wimmer, Marco Caversaccio, Christian Rummel, Stefan Weder","doi":"10.1016/j.heares.2024.109155","DOIUrl":null,"url":null,"abstract":"<p><strong>Objectives: </strong>Understanding brain processing of auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is deemed to be highly suitable for measuring brain activity during language tasks. However, accurate data interpretation also requires validated stimuli and behavioral measures.</p><p><strong>Design: </strong>Twenty-six adults with normal hearing listened to sentences from the Oldenburg Sentence Test (OLSA), and brain activation in the temporal, occipital, and prefrontal areas was measured by fNIRS. The sentences were presented in one of the four different modalities: speech-in-quiet, speech-in-noise, audiovisual speech or visual speech (i.e., lipreading). To support the interpretation of our fNIRS data, and to obtain a more comprehensive understanding of the study population, we performed hearing tests (pure tone and speech audiometry) and collected behavioral data using validated questionnaires, in-task comprehension questions, and listening effort ratings.</p><p><strong>Results: </strong>In the auditory conditions (i.e., speech-in-quiet and speech-in-noise), we observed cortical activity in the temporal regions bilaterally. During the visual speech condition, we measured significant activation in the occipital area. Following the audiovisual condition, cortical activation was observed in both regions. 
Furthermore, we established a baseline for how individuals with normal hearing process visual cues during lipreading, and we found higher activity in the prefrontal cortex in noise conditions compared to quiet conditions, linked to higher listening effort.</p><p><strong>Conclusions: </strong>We demonstrated the applicability of a clinically inspired audiovisual speech-comprehension task in participants with normal hearing. The measured brain activation patterns were supported and complemented by objective and behavioral parameters.</p>","PeriodicalId":12881,"journal":{"name":"Hearing Research","volume":"455 ","pages":"109155"},"PeriodicalIF":2.5000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Hearing Research","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1016/j.heares.2024.109155","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/11/30 0:00:00","PubModel":"Epub","JCR":"Q1","JCRName":"AUDIOLOGY & SPEECH-LANGUAGE PATHOLOGY","Score":null,"Total":0}
Citations: 0
Abstract
Objectives: Understanding how the brain processes auditory and visual speech is essential for advancing speech perception research and improving clinical interventions for individuals with hearing impairment. Functional near-infrared spectroscopy (fNIRS) is well suited to measuring brain activity during language tasks. However, accurate data interpretation also requires validated stimuli and behavioral measures.
Design: Twenty-six adults with normal hearing listened to sentences from the Oldenburg Sentence Test (OLSA), and brain activation in the temporal, occipital, and prefrontal areas was measured by fNIRS. The sentences were presented in one of four modalities: speech-in-quiet, speech-in-noise, audiovisual speech, or visual speech (i.e., lipreading). To support the interpretation of our fNIRS data, and to obtain a more comprehensive understanding of the study population, we performed hearing tests (pure tone and speech audiometry) and collected behavioral data using validated questionnaires, in-task comprehension questions, and listening effort ratings.
Results: In the auditory conditions (i.e., speech-in-quiet and speech-in-noise), we observed cortical activity in the temporal regions bilaterally. During the visual speech condition, we measured significant activation in the occipital area. In the audiovisual condition, cortical activation was observed in both regions. Furthermore, we established a baseline for how individuals with normal hearing process visual cues during lipreading, and we found higher activity in the prefrontal cortex in noise conditions than in quiet conditions, consistent with higher listening effort.
Conclusions: We demonstrated the applicability of a clinically inspired audiovisual speech-comprehension task in participants with normal hearing. The measured brain activation patterns were supported and complemented by objective and behavioral parameters.
Journal Description
The aim of the journal is to provide a forum for papers concerned with basic peripheral and central auditory mechanisms. Emphasis is on experimental and clinical studies, but theoretical and methodological papers will also be considered. The journal publishes original research papers, review and mini-review articles, rapid communications, method/protocol and perspective articles.
Papers submitted should deal with auditory anatomy, physiology, psychophysics, imaging, modeling and behavioural studies in animals and humans, as well as hearing aids and cochlear implants. Papers dealing with the vestibular system are also considered for publication. Papers on comparative aspects of hearing and on effects of drugs and environmental contaminants on hearing function will also be considered. Clinical papers will be accepted when they contribute to the understanding of normal and pathological hearing functions.