Validation of the Emotionally Congruent and Incongruent Face-Body Static Set (ECIFBSS).

Behavior Research Methods · Pub Date: 2025-01-03 · DOI: 10.3758/s13428-024-02550-w
Impact Factor: 4.6 · JCR Q1 (Psychology, Experimental) · CAS Tier 2 (Psychology)
Anne-Sophie Puffet, Simon Rigoulot
{"title":"Validation of the Emotionally Congruent and Incongruent Face-Body Static Set (ECIFBSS).","authors":"Anne-Sophie Puffet, Simon Rigoulot","doi":"10.3758/s13428-024-02550-w","DOIUrl":null,"url":null,"abstract":"<p><p>Frequently, we perceive emotional information through multiple channels (e.g., face, voice, posture). These cues interact, facilitating emotional perception when congruent (similar across channels) compared to incongruent (different). Most previous studies on this congruency effect used stimuli from different sets, compromising their quality. In this context, we created and validated a new static stimulus set (ECIFBSS) featuring 1952 facial and body expressions of basic emotions in congruent and incongruent situations. We photographed 40 actors expressing facial emotions and body postures (anger, disgust, happiness, neutral, fear, surprise, and sadness) in both congruent and incongruent situations. The validation was conducted in two parts. In the first part, 76 participants performed a recognition task on facial and bodily expressions separately. In the second part, 40 participants performed the same recognition task, along with an evaluation of four features: intensity, authenticity, arousal, and valence. All emotions (face and body) were well recognized. Consistent with the literature, facial emotions were recognized better than body postures. Happiness was the most recognized facial emotion, while fear was the least. Among body expressions, anger had the highest recognition, while disgust was the least accurately recognized. Finally, facial and bodily expressions were considered moderately authentic, and the evaluation of intensity, valence, and arousal aligned with the dimensional model. The ECIFBSS offers static stimuli for studying facial and body expressions of basic emotions, providing a new tool to explore integrating emotional information from various channels and their reciprocal influence.</p>","PeriodicalId":8717,"journal":{"name":"Behavior Research Methods","volume":"57 1","pages":"41"},"PeriodicalIF":4.6000,"publicationDate":"2025-01-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Behavior Research Methods","FirstCategoryId":"102","ListUrlMain":"https://doi.org/10.3758/s13428-024-02550-w","RegionNum":2,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"PSYCHOLOGY, EXPERIMENTAL","Score":null,"Total":0}
Citations: 0

Abstract

Frequently, we perceive emotional information through multiple channels (e.g., face, voice, posture). These cues interact: emotional perception is facilitated when they are congruent (similar across channels) compared to when they are incongruent (different). Most previous studies of this congruency effect combined stimuli from different sets, compromising stimulus quality. In this context, we created and validated a new static stimulus set (ECIFBSS) featuring 1952 facial and body expressions of basic emotions in congruent and incongruent situations. We photographed 40 actors expressing facial emotions and body postures (anger, disgust, happiness, neutral, fear, surprise, and sadness) in both congruent and incongruent combinations. The validation was conducted in two parts. In the first, 76 participants performed a recognition task on facial and bodily expressions separately. In the second, 40 participants performed the same recognition task and additionally rated four features: intensity, authenticity, arousal, and valence. All emotions (face and body) were well recognized. Consistent with the literature, facial emotions were recognized better than body postures. Happiness was the best-recognized facial emotion, while fear was the least recognized. Among body expressions, anger had the highest recognition rate, while disgust had the lowest. Finally, facial and bodily expressions were judged moderately authentic, and the ratings of intensity, valence, and arousal aligned with the dimensional model of emotion. The ECIFBSS offers static stimuli for studying facial and body expressions of basic emotions, providing a new tool for exploring how emotional information from different channels is integrated and how those channels influence one another.
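To make the reported recognition results concrete, the sketch below shows one common way such validation data can be tabulated: per-channel, per-emotion recognition accuracy, i.e., the proportion of trials on which a participant's response matched the intended emotion. This is an illustrative Python sketch with toy data; the record layout and field names are assumptions, not the authors' actual data format or analysis pipeline.

```python
# Illustrative only: tabulating recognition accuracy for a stimulus set
# like the ECIFBSS. The (channel, intended, response) trial layout is a
# hypothetical stand-in, not the paper's actual data format.
from collections import defaultdict

# Toy trials: (channel, intended emotion, participant response)
trials = [
    ("face", "happiness", "happiness"),
    ("face", "fear", "surprise"),   # fear is often confused with surprise
    ("body", "anger", "anger"),
    ("body", "disgust", "sadness"),
    ("face", "happiness", "happiness"),
]

hits = defaultdict(int)    # correct responses per (channel, emotion)
totals = defaultdict(int)  # total trials per (channel, emotion)
for channel, intended, response in trials:
    key = (channel, intended)
    totals[key] += 1
    hits[key] += intended == response

for (channel, emotion), n in sorted(totals.items()):
    acc = hits[(channel, emotion)] / n
    print(f"{channel:>4} | {emotion:<9} | accuracy = {acc:.2f}")
```

On real data, each row would come from one participant's response to one stimulus, and accuracies would typically be computed per stimulus and then summarized per emotion and channel.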

Source journal: Behavior Research Methods
CiteScore: 10.30 · Self-citation rate: 9.30% · Articles per year: 266
Journal description: Behavior Research Methods publishes articles concerned with the methods, techniques, and instrumentation of research in experimental psychology. The journal focuses particularly on the use of computer technology in psychological research. An annual special issue is devoted to this field.
Latest articles from this journal:
- Distribution-free Bayesian analyses with the DFBA statistical package.
- Jiwar: A database and calculator for word neighborhood measures in 40 languages.
- Open-access network science: Investigating phonological similarity networks based on the SUBTLEX-US lexicon.
- Survey measures of metacognitive monitoring are often false.
- PREVIC: An adaptive parent report measure of expressive vocabulary in children between 3 and 8 years of age.