Effect of Action Units, Viewpoint and Immersion on Emotion Recognition Using Dynamic Virtual Faces

Miguel A Vicente-Querol, Antonio Fernández-Caballero, Pascual González, Luz M González-Gualda, Patricia Fernández-Sotos, José P Molina, Arturo S García
{"title":"动作单元、视点和沉浸对动态虚拟人脸情感识别的影响。","authors":"Miguel A Vicente-Querol,&nbsp;Antonio Fernández-Caballero,&nbsp;Pascual González,&nbsp;Luz M González-Gualda,&nbsp;Patricia Fernández-Sotos,&nbsp;José P Molina,&nbsp;Arturo S García","doi":"10.1142/S0129065723500533","DOIUrl":null,"url":null,"abstract":"<p><p>Facial affect recognition is a critical skill in human interactions that is often impaired in psychiatric disorders. To address this challenge, tests have been developed to measure and train this skill. Recently, virtual human (VH) and virtual reality (VR) technologies have emerged as novel tools for this purpose. This study investigates the unique contributions of different factors in the communication and perception of emotions conveyed by VHs. Specifically, it examines the effects of the use of action units (AUs) in virtual faces, the positioning of the VH (frontal or mid-profile), and the level of immersion in the VR environment (desktop screen versus immersive VR). Thirty-six healthy subjects participated in each condition. Dynamic virtual faces (DVFs), VHs with facial animations, were used to represent the six basic emotions and the neutral expression. The results highlight the important role of the accurate implementation of AUs in virtual faces for emotion recognition. Furthermore, it is observed that frontal views outperform mid-profile views in both test conditions, while immersive VR shows a slight improvement in emotion recognition. This study provides novel insights into the influence of these factors on emotion perception and advances the understanding and application of these technologies for effective facial emotion recognition training.</p>","PeriodicalId":94052,"journal":{"name":"International journal of neural systems","volume":"33 10","pages":"2350053"},"PeriodicalIF":0.0000,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Effect of Action Units, Viewpoint and Immersion on Emotion Recognition Using Dynamic Virtual Faces.\",\"authors\":\"Miguel A Vicente-Querol,&nbsp;Antonio Fernández-Caballero,&nbsp;Pascual González,&nbsp;Luz M González-Gualda,&nbsp;Patricia Fernández-Sotos,&nbsp;José P Molina,&nbsp;Arturo S García\",\"doi\":\"10.1142/S0129065723500533\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Facial affect recognition is a critical skill in human interactions that is often impaired in psychiatric disorders. To address this challenge, tests have been developed to measure and train this skill. Recently, virtual human (VH) and virtual reality (VR) technologies have emerged as novel tools for this purpose. This study investigates the unique contributions of different factors in the communication and perception of emotions conveyed by VHs. Specifically, it examines the effects of the use of action units (AUs) in virtual faces, the positioning of the VH (frontal or mid-profile), and the level of immersion in the VR environment (desktop screen versus immersive VR). Thirty-six healthy subjects participated in each condition. Dynamic virtual faces (DVFs), VHs with facial animations, were used to represent the six basic emotions and the neutral expression. The results highlight the important role of the accurate implementation of AUs in virtual faces for emotion recognition. Furthermore, it is observed that frontal views outperform mid-profile views in both test conditions, while immersive VR shows a slight improvement in emotion recognition. 
This study provides novel insights into the influence of these factors on emotion perception and advances the understanding and application of these technologies for effective facial emotion recognition training.</p>\",\"PeriodicalId\":94052,\"journal\":{\"name\":\"International journal of neural systems\",\"volume\":\"33 10\",\"pages\":\"2350053\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International journal of neural systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1142/S0129065723500533\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of neural systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/S0129065723500533","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Facial affect recognition is a critical skill in human interactions that is often impaired in psychiatric disorders. To address this challenge, tests have been developed to measure and train this skill. Recently, virtual human (VH) and virtual reality (VR) technologies have emerged as novel tools for this purpose. This study investigates the unique contributions of different factors in the communication and perception of emotions conveyed by VHs. Specifically, it examines the effects of the use of action units (AUs) in virtual faces, the positioning of the VH (frontal or mid-profile), and the level of immersion in the VR environment (desktop screen versus immersive VR). Thirty-six healthy subjects participated in each condition. Dynamic virtual faces (DVFs), VHs with facial animations, were used to represent the six basic emotions and the neutral expression. The results highlight the important role of the accurate implementation of AUs in virtual faces for emotion recognition. Furthermore, it is observed that frontal views outperform mid-profile views in both test conditions, while immersive VR shows a slight improvement in emotion recognition. This study provides novel insights into the influence of these factors on emotion perception and advances the understanding and application of these technologies for effective facial emotion recognition training.
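
The abstract does not detail how the AUs drive the dynamic virtual faces. As an illustrative sketch only, and not the authors' implementation, the Python snippet below pairs each basic emotion with a commonly cited prototypical FACS AU combination and ramps the AU intensities linearly over time, which is one simple way to turn a static target expression into a dynamic one. The AU sets, the `au_weights` helper and the 0.5 s onset duration are assumptions for illustration; the mapping from AU intensities to blend shapes in a rendering engine is assumed and not shown.

```python
# Illustrative sketch (not from the paper): prototypical FACS action-unit (AU)
# combinations for the six basic emotions, plus a linear onset ramp that makes
# the expression "dynamic" by growing AU intensity from neutral to full over time.

from typing import Dict, List

# Commonly cited EMFACS-style AU prototypes; the exact AU sets used for the
# paper's dynamic virtual faces may differ.
EMOTION_AUS: Dict[str, List[int]] = {
    "happiness": [6, 12],
    "sadness":   [1, 4, 15],
    "surprise":  [1, 2, 5, 26],
    "fear":      [1, 2, 4, 5, 7, 20, 26],
    "anger":     [4, 5, 7, 23],
    "disgust":   [9, 15, 16],
    "neutral":   [],
}


def au_weights(emotion: str, t: float, onset: float = 0.5) -> Dict[int, float]:
    """Return AU intensities in [0, 1] at time t (seconds).

    The expression ramps linearly from neutral to its full AU target over
    `onset` seconds; these intensities would then drive the face's blend shapes.
    """
    level = min(max(t / onset, 0.0), 1.0)
    return {au: level for au in EMOTION_AUS[emotion]}


if __name__ == "__main__":
    # Halfway through the onset of a happy expression: AU6 and AU12 at 0.5.
    print(au_weights("happiness", t=0.25))
```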
