MECA: Manipulation With Emotional Intensity-Aware Contrastive Learning and Attention-Based Discriminative Learning

IF 9.8 | CAS Tier 2 (Computer Science) | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | IEEE Transactions on Affective Computing | Pub Date: 2024-11-07 | DOI: 10.1109/TAFFC.2024.3493416
Seongho Kim;Byung Cheol Song
{"title":"MECA: Manipulation With Emotional Intensity-Aware Contrastive Learning and Attention-Based Discriminative Learning","authors":"Seongho Kim;Byung Cheol Song","doi":"10.1109/TAFFC.2024.3493416","DOIUrl":null,"url":null,"abstract":"With recent developments in deep learning, facial expression manipulation (FEM) has become one of the fields receiving great attention. However, many studies focus on learning without considering class distinction in latent space. This paper introduces a representation learning scheme that leverages self-attention and mutual information to effectively account for semantic attributes, such as facial expressions, in the FEM task. Our framework, utilizing attention-based discriminative learning and emotional intensity-aware contrastive learning, is capable of forming a compact embedding space. This compact embedding space can lead to more discerning and richer facial expression synthesis in actual synthesis results. As a result, we have derived facial expression synthesis results that are superior to the previous methods. Also, in terms of the FED metric, which can quantify the degree of facial expression expression in FEM, the proposed method outperforms the other methods. To demonstrate this successful result, we use t-SNE and visualize the actual embedding results for each class. Furthermore, we prove that the latent space formed through the proposed method is also helpful in terms of facial expression recognition.","PeriodicalId":13131,"journal":{"name":"IEEE Transactions on Affective Computing","volume":"16 2","pages":"1104-1116"},"PeriodicalIF":9.8000,"publicationDate":"2024-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Affective Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10746615/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

With recent advances in deep learning, facial expression manipulation (FEM) has attracted considerable attention. However, many studies learn representations without considering class separation in the latent space. This paper introduces a representation learning scheme that leverages self-attention and mutual information to effectively account for semantic attributes, such as facial expressions, in the FEM task. By combining attention-based discriminative learning with emotional intensity-aware contrastive learning, our framework forms a compact embedding space, which leads to more discriminative and richer facial expression synthesis. As a result, the proposed method produces facial expression synthesis superior to previous methods, and it also outperforms competing methods on the FED metric, which quantifies the degree of facial expressiveness in FEM. To illustrate this, we visualize the learned embeddings for each class with t-SNE. Furthermore, we show that the latent space formed by the proposed method is also beneficial for facial expression recognition.
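The abstract does not give the exact formulation of the emotional intensity-aware contrastive objective, so the sketch below is only illustrative, not the paper's method: a supervised contrastive loss in which same-emotion pairs are weighted by how close their annotated intensities are. All names (`embeddings`, `labels`, `intensities`), the weighting scheme, and the temperature value are assumptions introduced for this example.

```python
# Illustrative sketch of an intensity-aware supervised contrastive loss.
# NOTE: this is NOT the paper's formulation (the abstract does not give one);
# it only shows one plausible way to make same-class attraction depend on
# emotional intensity. All tensor names and the weighting are hypothetical.
import torch
import torch.nn.functional as F

def intensity_aware_contrastive_loss(embeddings, labels, intensities,
                                     temperature=0.1):
    """embeddings: (N, D) features; labels: (N,) emotion class ids;
    intensities: (N,) scalar emotional intensities in [0, 1]."""
    z = F.normalize(embeddings, dim=1)
    sim = z @ z.t() / temperature                      # (N, N) scaled cosine sims

    # Positive mask: pairs sharing an emotion class, excluding self-pairs.
    pos = labels.unsqueeze(0).eq(labels.unsqueeze(1)).float()
    pos.fill_diagonal_(0)

    # Weight positives by intensity similarity: closer intensity -> larger pull.
    w = 1.0 - (intensities.unsqueeze(0) - intensities.unsqueeze(1)).abs()
    pos_w = pos * w

    # Log-softmax over all other samples (self excluded from the denominator).
    logits_mask = 1.0 - torch.eye(len(z), device=z.device)
    log_prob = sim - torch.log((logits_mask * sim.exp()).sum(1, keepdim=True))

    # Weighted mean of positive log-probabilities per anchor.
    denom = pos_w.sum(1).clamp(min=1e-8)
    loss = -(pos_w * log_prob).sum(1) / denom
    return loss.mean()
```

Under this kind of weighting, two strongly smiling "happy" samples are pulled together harder than a weakly and a strongly smiling pair, which is one plausible way to obtain the compact, intensity-structured embedding space the abstract describes.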
Source journal: IEEE Transactions on Affective Computing
Categories: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, CYBERNETICS
CiteScore: 15.00
Self-citation rate: 6.20%
Articles per year: 174
Journal introduction: The IEEE Transactions on Affective Computing is an international and interdisciplinary journal. Its primary goal is to share research findings on the development of systems capable of recognizing, interpreting, and simulating human emotions and related affective phenomena. The journal publishes original research on the underlying principles and theories that explain how and why affective factors shape human-technology interactions. It also focuses on how techniques for sensing and simulating affect can enhance our understanding of human emotions and processes. Additionally, the journal explores the design, implementation, and evaluation of systems that prioritize the consideration of affect in their usability. We also welcome surveys of existing work that provide new perspectives on the historical and future directions of this field.
Latest articles from this journal
Nasal Dominance and Nostril Breathing Variability: Potential Biomarkers of Acute Stress
Charting the Unspoken: Causal Inference-Guided LLM Augmentation for Emotion Recognition in Conversation
R2G³Net: A Novel Hierarchical Spatial-Temporal Neural Network With a Regional-to-Global Fusion Mechanism for Multimodal Emotion Recognition
Exploring Spontaneous Facial Micro-expressions in On-road Driver Behavior
EmoAgent: A Multi-Agent Framework for Diverse Affective Image Manipulation