The role of the age and gender, and the complexity of the syntactic unit in the perception of affective emotions in voice.

CoDAS · IF 0.9 · Q4 (Audiology & Speech-Language Pathology) · Pub Date: 2024-07-19 · eCollection Date: 2024-01-01 · DOI: 10.1590/2317-1782/20242024009en
Baiba Trinite, Anita Zdanovica, Daiga Kurme, Evija Lavrane, Ilva Magazeina, Anita Jansone
{"title":"年龄和性别以及句法单元的复杂性在语音情感感知中的作用。","authors":"Baiba Trinite, Anita Zdanovica, Daiga Kurme, Evija Lavrane, Ilva Magazeina, Anita Jansone","doi":"10.1590/2317-1782/20242024009en","DOIUrl":null,"url":null,"abstract":"<p><strong>Purpose: </strong>The study aimed to identify (1) whether the age and gender of listeners and the length of vocal stimuli affect emotion discrimination accuracy in voice; and (2) whether the determined level of expression of perceived affective emotions is age and gender-dependent.</p><p><strong>Methods: </strong>Thirty-two age-matched listeners listened to 270 semantically neutral voice samples produced in neutral, happy, and angry intonation by ten professional actors. The participants were required to categorize the auditory stimulus based on three options and judge the intensity of emotional expression in the sample using a customized tablet web interface.</p><p><strong>Results: </strong>The discrimination accuracy of happy and angry emotions decreased with age, while accuracy in discriminating neutral emotions increased with age. Females rated the intensity level of perceived affective emotions higher than males across all linguistic units. These were: for angry emotions in words (z = -3.599, p < .001), phrases (z = -3.218, p = .001), and texts (z = -2.272, p = .023), for happy emotions in words (z = -5.799, p < .001), phrases (z = -4.706, p < .001), and texts (z = -2.699, p = .007).</p><p><strong>Conclusion: </strong>Accuracy in perceiving vocal expressions of emotions varies according to age and gender. Young adults are better at distinguishing happy and angry emotions than middle-aged adults, while middle-aged adults tend to categorize perceived affective emotions as neutral. Gender also plays a role, with females rating expressions of affective emotions in voices higher than males. Additionally, the length of voice stimuli impacts emotion discrimination accuracy.</p>","PeriodicalId":46547,"journal":{"name":"CoDAS","volume":null,"pages":null},"PeriodicalIF":0.9000,"publicationDate":"2024-07-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11340876/pdf/","citationCount":"0","resultStr":"{\"title\":\"The role of the age and gender, and the complexity of the syntactic unit in the perception of affective emotions in voice.\",\"authors\":\"Baiba Trinite, Anita Zdanovica, Daiga Kurme, Evija Lavrane, Ilva Magazeina, Anita Jansone\",\"doi\":\"10.1590/2317-1782/20242024009en\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><strong>Purpose: </strong>The study aimed to identify (1) whether the age and gender of listeners and the length of vocal stimuli affect emotion discrimination accuracy in voice; and (2) whether the determined level of expression of perceived affective emotions is age and gender-dependent.</p><p><strong>Methods: </strong>Thirty-two age-matched listeners listened to 270 semantically neutral voice samples produced in neutral, happy, and angry intonation by ten professional actors. The participants were required to categorize the auditory stimulus based on three options and judge the intensity of emotional expression in the sample using a customized tablet web interface.</p><p><strong>Results: </strong>The discrimination accuracy of happy and angry emotions decreased with age, while accuracy in discriminating neutral emotions increased with age. Females rated the intensity level of perceived affective emotions higher than males across all linguistic units. 
These were: for angry emotions in words (z = -3.599, p < .001), phrases (z = -3.218, p = .001), and texts (z = -2.272, p = .023), for happy emotions in words (z = -5.799, p < .001), phrases (z = -4.706, p < .001), and texts (z = -2.699, p = .007).</p><p><strong>Conclusion: </strong>Accuracy in perceiving vocal expressions of emotions varies according to age and gender. Young adults are better at distinguishing happy and angry emotions than middle-aged adults, while middle-aged adults tend to categorize perceived affective emotions as neutral. Gender also plays a role, with females rating expressions of affective emotions in voices higher than males. Additionally, the length of voice stimuli impacts emotion discrimination accuracy.</p>\",\"PeriodicalId\":46547,\"journal\":{\"name\":\"CoDAS\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.9000,\"publicationDate\":\"2024-07-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11340876/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"CoDAS\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1590/2317-1782/20242024009en\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q4\",\"JCRName\":\"AUDIOLOGY & SPEECH-LANGUAGE PATHOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"CoDAS","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1590/2317-1782/20242024009en","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/1 0:00:00","PubModel":"eCollection","JCR":"Q4","JCRName":"AUDIOLOGY & SPEECH-LANGUAGE PATHOLOGY","Score":null,"Total":0}
Citations: 0

Abstract


Purpose: The study aimed to identify (1) whether the age and gender of listeners and the length of vocal stimuli affect emotion discrimination accuracy in voice; and (2) whether the rated level of expression of perceived affective emotions depends on the listener's age and gender.

Methods: Thirty-two age-matched listeners heard 270 semantically neutral voice samples produced with neutral, happy, and angry intonation by ten professional actors. Using a customized tablet web interface, participants categorized each auditory stimulus into one of the three emotion categories and judged the intensity of emotional expression in the sample.
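
To make "emotion discrimination accuracy" concrete, the sketch below shows one way such a measure could be derived from a three-alternative forced-choice task: the proportion of stimuli whose intended emotion matches the category chosen by the listener, computed per emotion. This is an illustration only; the response records and their layout are hypothetical, not the study's actual data format.

    # Minimal sketch: per-emotion discrimination accuracy in a three-alternative
    # forced-choice task. The response pairs below are hypothetical placeholders.
    from collections import defaultdict

    responses = [
        # (intended emotion of the stimulus, category chosen by the listener)
        ("happy", "happy"), ("happy", "neutral"),
        ("angry", "angry"), ("angry", "angry"),
        ("neutral", "neutral"), ("neutral", "happy"),
    ]

    correct = defaultdict(int)
    total = defaultdict(int)
    for intended, chosen in responses:
        total[intended] += 1
        correct[intended] += int(intended == chosen)

    for emotion in ("neutral", "happy", "angry"):
        accuracy = correct[emotion] / total[emotion]
        print(f"{emotion}: {accuracy:.0%} ({correct[emotion]}/{total[emotion]})")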

Results: The accuracy of discriminating happy and angry emotions decreased with age, while the accuracy of discriminating neutral emotions increased with age. Females rated the intensity of perceived affective emotions higher than males across all linguistic units: for angry emotions in words (z = -3.599, p < .001), phrases (z = -3.218, p = .001), and texts (z = -2.272, p = .023), and for happy emotions in words (z = -5.799, p < .001), phrases (z = -4.706, p < .001), and texts (z = -2.699, p = .007).
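
The abstract does not name the test behind the reported z values; a rank-based group comparison such as the Mann-Whitney U test is one plausible choice for Likert-type intensity ratings. A minimal sketch under that assumption, with hypothetical ratings in place of the study's data:

    # Minimal sketch of a non-parametric gender comparison of intensity ratings.
    # Assumption: a Mann-Whitney U test with a normal approximation for z; the
    # abstract does not name the test, and these ratings are hypothetical.
    import numpy as np
    from scipy.stats import mannwhitneyu

    female_ratings = np.array([4, 5, 3, 5, 4, 4, 5, 3])  # hypothetical intensity ratings
    male_ratings   = np.array([3, 3, 4, 2, 3, 4, 3, 2])  # hypothetical intensity ratings

    u_stat, p_value = mannwhitneyu(male_ratings, female_ratings, alternative="two-sided")

    # Convert U to an approximate z score (normal approximation, no tie correction).
    n1, n2 = len(male_ratings), len(female_ratings)
    mu_u = n1 * n2 / 2
    sigma_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (u_stat - mu_u) / sigma_u

    print(f"U = {u_stat:.1f}, z = {z:.3f}, p = {p_value:.3f}")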

Conclusion: Accuracy in perceiving vocal expressions of emotions varies according to age and gender. Young adults are better at distinguishing happy and angry emotions than middle-aged adults, while middle-aged adults tend to categorize perceived affective emotions as neutral. Gender also plays a role, with females rating expressions of affective emotions in voices higher than males. Additionally, the length of voice stimuli impacts emotion discrimination accuracy.
