Naturalistic multimodal emotion data with deep learning can advance the theoretical understanding of emotion.

Psychological Research-Psychologische Forschung | IF 2.2 | JCR Q2 (Psychology, Experimental) | CAS Tier 3 (Psychology) | Published: 2024-12-21 | DOI: 10.1007/s00426-024-02068-y
Thanakorn Angkasirisan
Volume 89(1), Article 36 | Journal Article | Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11663169/pdf/
Citations: 0

Abstract

What are emotions? Despite being a century-old question, emotion scientists have yet to agree on what emotions exactly are. Emotions are diversely conceptualised as innate responses (evolutionary view), mental constructs (constructivist view), cognitive evaluations (appraisal view), or self-organising states (dynamical systems view). This enduring fragmentation likely stems from the limitations of traditional research methods, which often adopt narrow methodological approaches. Methods from artificial intelligence (AI), particularly those leveraging big data and deep learning, offer promising approaches for overcoming these limitations. By integrating data from multimodal markers of emotion, including subjective experiences, contextual factors, brain-bodily physiological signals and expressive behaviours, deep learning algorithms can uncover and map their complex relationships within multidimensional spaces. This multimodal emotion framework has the potential to provide novel, nuanced insights into long-standing questions, such as whether emotion categories are innate or learned and whether emotions exhibit coherence or degeneracy, thereby refining emotion theories. Significant challenges remain, particularly in obtaining comprehensive naturalistic multimodal emotion data, highlighting the need for advances in synchronous measurement of naturalistic multimodal emotion.
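The abstract's central methodological claim, that deep learning can integrate multimodal emotion markers and map their relationships within a shared multidimensional space, can be illustrated with a minimal sketch. The modality names, feature values, and weight matrix below are illustrative assumptions, not from the paper; a fixed linear projection stands in for a trained deep encoder.

```python
import math

# Illustrative sketch (not the paper's method): fuse feature vectors from
# three hypothetical emotion modalities into one vector (early fusion),
# then map it into a shared low-dimensional "emotion space".

def fuse(modalities):
    """Concatenate per-modality feature vectors into one fused vector."""
    fused = []
    for vec in modalities.values():  # dicts preserve insertion order
        fused.extend(vec)
    return fused

# Fixed illustrative weights standing in for a trained deep encoder
# (4 embedding dimensions x 5 input features).
W = [
    [0.20, -0.10, 0.010, 0.30, 0.50],
    [-0.40, 0.20, 0.020, -0.10, 0.10],
    [0.10, 0.30, -0.010, 0.20, -0.20],
    [0.05, -0.20, 0.015, 0.10, 0.30],
]

def embed(vec):
    """Project the fused vector into the shared embedding space."""
    return [sum(w * x for w, x in zip(row, vec)) for row in W]

def cosine(a, b):
    """Cosine similarity between two embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

# Two hypothetical emotion episodes with similar multimodal profiles:
# self-report (valence, arousal), physiology (heart rate, skin conductance),
# expression (smile intensity).
episode_a = {"self_report": [0.9, 0.1], "physiology": [72.0, 0.4], "expression": [0.8]}
episode_b = {"self_report": [0.8, 0.2], "physiology": [70.0, 0.5], "expression": [0.7]}

emb_a = embed(fuse(episode_a))
emb_b = embed(fuse(episode_b))
print(f"similarity: {cosine(emb_a, emb_b):.3f}")  # close to 1 for similar episodes
```

In the framework the abstract proposes, the projection would be learned end-to-end from large naturalistic datasets, so that distances in the embedding space could reveal whether episodes cluster into discrete categories or vary continuously.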

Source journal: Psychological Research-Psychologische Forschung
CiteScore: 5.10
Self-citation rate: 8.70%
Articles published per year: 137
About the journal: Psychological Research/Psychologische Forschung publishes articles that contribute to a basic understanding of human perception, attention, memory, and action. The Journal is devoted to the dissemination of knowledge based on firm experimental ground, but not to particular approaches or schools of thought. Theoretical and historical papers are welcome to the extent that they serve this general purpose; papers of an applied nature are acceptable if they contribute to basic understanding or serve to bridge the often felt gap between basic and applied research in the field covered by the Journal.
Latest articles in this journal:

- Grounded cognition and the representation of momentum: abstract concepts modulate mislocalization.
- Can't help processing numbers with text: Eye-tracking evidence for simultaneous instead of sequential processing of text and numbers in arithmetic word problems.
- Effect of spatial training on space-number mapping: a situated cognition account.
- Action toward sound sources enhances auditory spatial confidence: on the metacognitive consequences of reaching to sounds.
- SNARC effect in a transfer paradigm: long-lasting effects of stimulus-response compatibility practices.