LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition Using Graph Neural Networks

IEEE Transactions on Affective Computing · IF 9.8 · CAS Tier 2 (Computer Science) · JCR Q1 (Computer Science, Artificial Intelligence) · Volume 16, Issue 3, pp. 1747-1759 · Publication date: 2025-01-30 · DOI: 10.1109/TAFFC.2025.3537538
Gokul S Krishnan;Sarala Padi;Craig S. Greenberg;Balaraman Ravindran;Dinesh Manocha;Ram D. Sriram
{"title":"LineConGraphs: Line Conversation Graphs for Effective Emotion Recognition Using Graph Neural Networks","authors":"Gokul S Krishnan;Sarala Padi;Craig S. Greenberg;Balaraman Ravindran;Dinesh Manocha;Ram D. Sriram","doi":"10.1109/TAFFC.2025.3537538","DOIUrl":null,"url":null,"abstract":"Emotion Recognition in Conversations (ERC) is an important aspect of affective computing with practical applications in healthcare, education, chatbots, and social media platforms. Previous approaches to <inline-formula><tex-math>$\\text {ERC}$</tex-math></inline-formula> analysis involved using graph neural network architectures to model both speaker and long-term contextual information. In this paper, we introduce new models for <inline-formula><tex-math>$\\text {ERC}$</tex-math></inline-formula> analysis: the <i>LineConGCN</i> and <i>LineConGAT</i> models, which are constructed using a graph construction strategy for conversations called line conversational graphs (<i>LineConGraphs</i>). <i>LineConGraph</i> is designed to capture short-term conversational context using one previous and future utterance, while also capturing long-term context using GCN or GAT layers without explicitly integrating into the graph construction strategy. We evaluate the performance of our proposed models on two benchmark datasets, <inline-formula><tex-math>$\\text {IEMOCAP}$</tex-math></inline-formula> and <inline-formula><tex-math>$\\text {MELD}$</tex-math></inline-formula>, and show that our <i>LineConGAT</i> model outperforms the state-of-the-art methods with an F1-score of <inline-formula><tex-math>$\\text {64.58}\\%$</tex-math></inline-formula> and <inline-formula><tex-math>$\\text {76.50}\\%$</tex-math></inline-formula>. Furthermore, we demonstrate that incorporating sentiment shift information into line conversation graphs further enhances <inline-formula><tex-math>$\\text {ERC}$</tex-math></inline-formula> performance in the case of <i>LineConGCN</i> models. We also evaluate the performance of our proposed model by embedding speaker information into <i>LineConGCN</i> and <i>LineConGAT</i> models and show that <i>LineConGAT</i> and <i>LineConGAT</i> with speaker embeddings performed equally for ERC analysis.","PeriodicalId":13131,"journal":{"name":"IEEE Transactions on Affective Computing","volume":"16 3","pages":"1747-1759"},"PeriodicalIF":9.8000,"publicationDate":"2025-01-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Affective Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10858741/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Emotion Recognition in Conversations (ERC) is an important aspect of affective computing, with practical applications in healthcare, education, chatbots, and social media platforms. Previous approaches to ERC analysis used graph neural network architectures to model both speaker and long-term contextual information. In this paper, we introduce new models for ERC analysis: the LineConGCN and LineConGAT models, which are built using a graph construction strategy for conversations called line conversation graphs (LineConGraphs). A LineConGraph is designed to capture short-term conversational context using one previous and one future utterance, while long-term context is captured by GCN or GAT layers rather than being explicitly integrated into the graph construction strategy. We evaluate the performance of our proposed models on two benchmark datasets, IEMOCAP and MELD, and show that our LineConGAT model outperforms state-of-the-art methods with F1-scores of 64.58% and 76.50%, respectively. Furthermore, we demonstrate that incorporating sentiment shift information into line conversation graphs further enhances ERC performance for LineConGCN models. We also evaluate our proposed models by embedding speaker information into the LineConGCN and LineConGAT models, and show that LineConGAT with and without speaker embeddings performed equally well for ERC analysis.
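To make the graph construction concrete, below is a minimal sketch (not the authors' released code) of how a line conversation graph could be built and passed through GAT layers using PyTorch Geometric. The utterance embeddings, layer sizes, number of attention heads, and the helper names build_line_conversation_graph and LineConGATSketch are illustrative assumptions; only the chain-structured edges (each utterance linked to its immediate predecessor and successor) follow the description in the abstract.

```python
# Illustrative sketch only: assumes PyTorch and PyTorch Geometric are available.
import torch
from torch_geometric.data import Data
from torch_geometric.nn import GATConv


def build_line_conversation_graph(utterance_embeddings: torch.Tensor) -> Data:
    """Link each utterance to its immediate predecessor and successor (bidirectional chain)."""
    num_utterances = utterance_embeddings.size(0)
    src, dst = [], []
    for i in range(num_utterances - 1):
        src += [i, i + 1]  # edge i -> i+1 and its reverse i+1 -> i
        dst += [i + 1, i]
    edge_index = torch.tensor([src, dst], dtype=torch.long)
    return Data(x=utterance_embeddings, edge_index=edge_index)


class LineConGATSketch(torch.nn.Module):
    """Two GAT layers over the line graph; long-range context propagates
    through stacked message passing rather than extra long-distance edges."""

    def __init__(self, in_dim: int, hidden_dim: int, num_emotions: int):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden_dim, heads=4, concat=True)
        self.gat2 = GATConv(hidden_dim * 4, num_emotions, heads=1, concat=False)

    def forward(self, data: Data) -> torch.Tensor:
        h = torch.relu(self.gat1(data.x, data.edge_index))
        return self.gat2(h, data.edge_index)  # one emotion logit vector per utterance


# Example: a 5-utterance conversation with 768-d utterance embeddings
# (the sentence encoder producing them is an assumption, not specified here).
graph = build_line_conversation_graph(torch.randn(5, 768))
logits = LineConGATSketch(in_dim=768, hidden_dim=64, num_emotions=7)(graph)
print(logits.shape)  # torch.Size([5, 7])
```

Because edges only connect adjacent utterances, longer-range context reaches a node through stacked message-passing layers, which mirrors the abstract's point that long-term context is handled by the GCN/GAT layers rather than by the graph construction itself.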
Source Journal
IEEE Transactions on Affective Computing
Subject categories: Computer Science, Artificial Intelligence; Computer Science, Cybernetics
CiteScore: 15.00
Self-citation rate: 6.20%
Articles published: 174
About the journal: The IEEE Transactions on Affective Computing is an international and interdisciplinary journal. Its primary goal is to share research findings on the development of systems capable of recognizing, interpreting, and simulating human emotions and related affective phenomena. The journal publishes original research on the underlying principles and theories that explain how and why affective factors shape human-technology interactions. It also focuses on how techniques for sensing and simulating affect can enhance our understanding of human emotions and processes. Additionally, the journal explores the design, implementation, and evaluation of systems that prioritize the consideration of affect in their usability. We also welcome surveys of existing work that provide new perspectives on the historical and future directions of this field.
Latest articles from this journal
- Hierarchical Vision-Language Interaction for Facial Action Unit Detection
- SENSE-7: Taxonomy and Dataset for Measuring User Perceptions of Empathy in Sustained Human-AI Conversations
- Assessing the Representation of Suicidal Ideation in Social Media Datasets Relative to Suicide Notes
- CMCRD: Cross-Modal Contrastive Representation Distillation for Emotion Recognition
- SMA-EL: A Minimal 1-cycle Construction Algorithm with Simplicial Maps Annotation and Edge Loss for Emotional Brain Networks Analysis