Neural basis of sound-symbolic pseudoword-shape correspondences

IF: 2.0 | JCR: Q3 (Behavioral Sciences) | Psychology (Tier 3) | Neuropsychologia | Pub Date: 2023-09-09 | DOI: 10.1016/j.neuropsychologia.2023.108657
Deborah A. Barany , Simon Lacey , Kaitlyn L. Matthews , Lynne C. Nygaard , K. Sathian
{"title":"发音的神经基础符号伪词的形状对应。","authors":"Deborah A. Barany ,&nbsp;Simon Lacey ,&nbsp;Kaitlyn L. Matthews ,&nbsp;Lynne C. Nygaard ,&nbsp;K. Sathian","doi":"10.1016/j.neuropsychologia.2023.108657","DOIUrl":null,"url":null,"abstract":"<div><p><span><span>Non-arbitrary mapping between the sound of a word and its meaning, termed sound symbolism, is commonly studied through crossmodal correspondences between sounds and visual shapes, e.g., auditory pseudowords, like ‘mohloh’ and ‘kehteh’, are matched to rounded and pointed visual shapes, respectively. Here, we used functional magnetic resonance imaging (fMRI) during a crossmodal matching task to investigate the hypotheses that sound symbolism (1) involves language processing; (2) depends on multisensory integration; (3) reflects embodiment of speech in hand movements. These hypotheses lead to corresponding neuroanatomical predictions of crossmodal congruency effects in (1) the language network; (2) areas mediating multisensory processing, including visual and </span>auditory cortex<span>; (3) regions responsible for sensorimotor control of the hand and mouth. Right-handed participants (</span></span><em>n</em><span><span> = 22) encountered audiovisual stimuli comprising a simultaneously presented visual shape (rounded or pointed) and an auditory pseudoword (‘mohloh’ or ‘kehteh’) and indicated via a right-hand keypress whether the stimuli matched or not. Reaction times were faster for congruent than incongruent stimuli. Univariate analysis showed that activity was greater for the congruent compared to the incongruent condition in the left primary and association auditory cortex, and left anterior fusiform/parahippocampal </span>gyri. Multivoxel pattern analysis revealed higher classification accuracy for the audiovisual stimuli when congruent than when incongruent, in the pars opercularis of the left inferior frontal (Broca's area), the left supramarginal, and the right mid-occipital gyri. These findings, considered in relation to the neuroanatomical predictions, support the first two hypotheses and suggest that sound symbolism involves both language processing and multisensory integration.</span></p></div>","PeriodicalId":19279,"journal":{"name":"Neuropsychologia","volume":"188 ","pages":"Article 108657"},"PeriodicalIF":2.0000,"publicationDate":"2023-09-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10529692/pdf/","citationCount":"0","resultStr":"{\"title\":\"Neural basis of sound-symbolic pseudoword-shape correspondences\",\"authors\":\"Deborah A. Barany ,&nbsp;Simon Lacey ,&nbsp;Kaitlyn L. Matthews ,&nbsp;Lynne C. Nygaard ,&nbsp;K. Sathian\",\"doi\":\"10.1016/j.neuropsychologia.2023.108657\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p><span><span>Non-arbitrary mapping between the sound of a word and its meaning, termed sound symbolism, is commonly studied through crossmodal correspondences between sounds and visual shapes, e.g., auditory pseudowords, like ‘mohloh’ and ‘kehteh’, are matched to rounded and pointed visual shapes, respectively. Here, we used functional magnetic resonance imaging (fMRI) during a crossmodal matching task to investigate the hypotheses that sound symbolism (1) involves language processing; (2) depends on multisensory integration; (3) reflects embodiment of speech in hand movements. 
These hypotheses lead to corresponding neuroanatomical predictions of crossmodal congruency effects in (1) the language network; (2) areas mediating multisensory processing, including visual and </span>auditory cortex<span>; (3) regions responsible for sensorimotor control of the hand and mouth. Right-handed participants (</span></span><em>n</em><span><span> = 22) encountered audiovisual stimuli comprising a simultaneously presented visual shape (rounded or pointed) and an auditory pseudoword (‘mohloh’ or ‘kehteh’) and indicated via a right-hand keypress whether the stimuli matched or not. Reaction times were faster for congruent than incongruent stimuli. Univariate analysis showed that activity was greater for the congruent compared to the incongruent condition in the left primary and association auditory cortex, and left anterior fusiform/parahippocampal </span>gyri. Multivoxel pattern analysis revealed higher classification accuracy for the audiovisual stimuli when congruent than when incongruent, in the pars opercularis of the left inferior frontal (Broca's area), the left supramarginal, and the right mid-occipital gyri. These findings, considered in relation to the neuroanatomical predictions, support the first two hypotheses and suggest that sound symbolism involves both language processing and multisensory integration.</span></p></div>\",\"PeriodicalId\":19279,\"journal\":{\"name\":\"Neuropsychologia\",\"volume\":\"188 \",\"pages\":\"Article 108657\"},\"PeriodicalIF\":2.0000,\"publicationDate\":\"2023-09-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10529692/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neuropsychologia\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0028393223001914\",\"RegionNum\":3,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"BEHAVIORAL SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neuropsychologia","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0028393223001914","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"BEHAVIORAL SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract


Non-arbitrary mapping between the sound of a word and its meaning, termed sound symbolism, is commonly studied through crossmodal correspondences between sounds and visual shapes, e.g., auditory pseudowords, like ‘mohloh’ and ‘kehteh’, are matched to rounded and pointed visual shapes, respectively. Here, we used functional magnetic resonance imaging (fMRI) during a crossmodal matching task to investigate the hypotheses that sound symbolism (1) involves language processing; (2) depends on multisensory integration; (3) reflects embodiment of speech in hand movements. These hypotheses lead to corresponding neuroanatomical predictions of crossmodal congruency effects in (1) the language network; (2) areas mediating multisensory processing, including visual and auditory cortex; (3) regions responsible for sensorimotor control of the hand and mouth. Right-handed participants (n = 22) encountered audiovisual stimuli comprising a simultaneously presented visual shape (rounded or pointed) and an auditory pseudoword (‘mohloh’ or ‘kehteh’) and indicated via a right-hand keypress whether the stimuli matched or not. Reaction times were faster for congruent than incongruent stimuli. Univariate analysis showed that activity was greater for the congruent compared to the incongruent condition in the left primary and association auditory cortex, and left anterior fusiform/parahippocampal gyri. Multivoxel pattern analysis revealed higher classification accuracy for the audiovisual stimuli when congruent than when incongruent, in the pars opercularis of the left inferior frontal (Broca's area), the left supramarginal, and the right mid-occipital gyri. These findings, considered in relation to the neuroanatomical predictions, support the first two hypotheses and suggest that sound symbolism involves both language processing and multisensory integration.
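As an illustrative aside rather than a description of the authors' actual analysis pipeline, the MVPA result above can be read as: decode which audiovisual stimulus pair was presented from trial-wise voxel patterns within a region of interest, separately for congruent and incongruent trials, and compare the cross-validated accuracies. A minimal sketch in Python with scikit-learn, using hypothetical data shapes and variable names (X_congruent, y_congruent, etc.), might look like this:

```python
# Illustrative sketch only (assumed, not the authors' code): compare cross-validated
# decoding accuracy for congruent vs. incongruent audiovisual trials within one ROI.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def decoding_accuracy(patterns, labels, n_folds=5):
    """Mean cross-validated accuracy of a linear SVM that decodes which
    pseudoword-shape pair was presented from trial-wise voxel patterns."""
    clf = SVC(kernel="linear")
    scores = cross_val_score(clf, patterns, labels, cv=n_folds)
    return scores.mean()

# Hypothetical data: 80 trials x 200 voxels per condition (random placeholder values).
rng = np.random.default_rng(0)
X_congruent = rng.standard_normal((80, 200))    # e.g., patterns from left IFG pars opercularis
y_congruent = rng.integers(0, 2, size=80)       # 0 = 'mohloh'+rounded, 1 = 'kehteh'+pointed
X_incongruent = rng.standard_normal((80, 200))
y_incongruent = rng.integers(0, 2, size=80)     # 0 = 'mohloh'+pointed, 1 = 'kehteh'+rounded

acc_congruent = decoding_accuracy(X_congruent, y_congruent)
acc_incongruent = decoding_accuracy(X_incongruent, y_incongruent)
print(f"Congruent decoding accuracy:   {acc_congruent:.2f}")
print(f"Incongruent decoding accuracy: {acc_incongruent:.2f}")
```

With real data, the congruent-incongruent accuracy difference would additionally be tested for significance (for example, with permutation tests across participants); the sketch only shows the structure of the comparison.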

Source journal: Neuropsychologia (Medicine - Behavioral Sciences)
CiteScore: 5.10
Self-citation rate: 3.80%
Articles per year: 228
Review time: 4 months
Journal introduction: Neuropsychologia is an international interdisciplinary journal devoted to experimental and theoretical contributions that advance understanding of human cognition and behavior from a neuroscience perspective. The journal will consider for publication studies that link brain function with cognitive processes, including attention and awareness, action and motor control, executive functions and cognitive control, memory, language, and emotion and social cognition.
Latest articles in this journal:
- Primary manipulation knowledge of objects is associated with the functional coupling of pMTG and aIPS
- Temporal dynamics of implicit moral evaluation: From empathy for pain to mentalizing processes
- Neuroimaging and perceptual-cognitive expertise in sport: A narrative review of research and future directions
- Working memory load increases movement-related alpha and beta desynchronization
- Editorial Board