Detection of emotional faces: The role of spatial frequencies and local features

IF 1.5 | CAS Q4 (Psychology) | JCR Q4 (Neurosciences) | Vision Research | Pub Date: 2023-10-01 | DOI: 10.1016/j.visres.2023.108281
Léa Entzmann, Nathalie Guyader, Louise Kauffmann, Carole Peyrin, Martial Mermillod
{"title":"Detection of emotional faces: The role of spatial frequencies and local features","authors":"Léa Entzmann ,&nbsp;Nathalie Guyader ,&nbsp;Louise Kauffmann ,&nbsp;Carole Peyrin ,&nbsp;Martial Mermillod","doi":"10.1016/j.visres.2023.108281","DOIUrl":null,"url":null,"abstract":"<div><p><span>Models of emotion processing suggest that threat-related stimuli such as fearful faces can be detected based on the rapid extraction of low spatial frequencies. However, this remains debated as other models argue that the decoding of facial expressions occurs with a more flexible use of spatial frequencies. The purpose of this study was to clarify the role of spatial frequencies and differences in luminance contrast between spatial frequencies, on the detection of facial emotions. We used a saccadic choice task in which emotional-neutral face pairs were presented and participants were asked to make a saccade toward the neutral or the emotional (happy or fearful) face. Faces were displayed either in low, high, or broad spatial frequencies. Results showed that participants were better to saccade toward the emotional face. They were also better for high or broad than low spatial frequencies, and the accuracy was higher with a happy target. An analysis of the eye and mouth saliency of</span> <!-->our stimuli revealed that the mouth saliency of the target correlates with participants’ performance. Overall, this study underlines the importance of local more than global information, and of the saliency of the mouth region in the detection of emotional and neutral faces.</p></div>","PeriodicalId":23670,"journal":{"name":"Vision Research","volume":null,"pages":null},"PeriodicalIF":1.5000,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Vision Research","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0042698923001050","RegionNum":4,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

Models of emotion processing suggest that threat-related stimuli such as fearful faces can be detected based on the rapid extraction of low spatial frequencies. However, this remains debated, as other models argue that the decoding of facial expressions relies on a more flexible use of spatial frequencies. The purpose of this study was to clarify the role of spatial frequencies, and of differences in luminance contrast between spatial frequencies, in the detection of facial emotions. We used a saccadic choice task in which emotional-neutral face pairs were presented and participants were asked to make a saccade toward either the neutral or the emotional (happy or fearful) face. Faces were displayed in low, high, or broad spatial frequencies. Results showed that participants were more accurate when making a saccade toward the emotional face. Performance was also better for high and broad than for low spatial frequencies, and accuracy was higher with a happy target. An analysis of the eye and mouth saliency of our stimuli revealed that the mouth saliency of the target correlated with participants’ performance. Overall, this study underlines the importance of local over global information, and of the saliency of the mouth region, in the detection of emotional and neutral faces.
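As a point of reference for the spatial-frequency manipulation described in the abstract, the sketch below shows one standard way to build low (LSF), high (HSF), and broad (BSF) spatial-frequency versions of a grayscale face image: Gaussian filtering in the Fourier domain followed by equalization of mean luminance and RMS contrast. The cutoff values, filter shape, and normalization parameters are illustrative assumptions, not the settings reported in the paper's methods.

```python
import numpy as np


def gaussian_lowpass(shape, cutoff_cpi):
    """Gaussian low-pass mask in the Fourier domain.

    cutoff_cpi is the standard deviation of the Gaussian envelope,
    expressed in cycles per image (illustrative parameterization).
    """
    h, w = shape
    fy = np.fft.fftfreq(h) * h          # vertical frequencies, cycles/image
    fx = np.fft.fftfreq(w) * w          # horizontal frequencies, cycles/image
    radius = np.sqrt(fx[np.newaxis, :] ** 2 + fy[:, np.newaxis] ** 2)
    return np.exp(-(radius ** 2) / (2.0 * cutoff_cpi ** 2))


def spatial_frequency_versions(img, lsf_cutoff=8.0, hsf_cutoff=24.0):
    """Return (lsf, hsf, bsf) versions of a grayscale image.

    lsf_cutoff / hsf_cutoff are assumed cycles-per-image values,
    not the cutoffs used in the study.
    """
    spectrum = np.fft.fft2(img)
    lsf = np.real(np.fft.ifft2(spectrum * gaussian_lowpass(img.shape, lsf_cutoff)))
    hsf = np.real(np.fft.ifft2(spectrum * (1.0 - gaussian_lowpass(img.shape, hsf_cutoff))))
    return lsf, hsf, img.astype(float)


def match_luminance_and_contrast(img, mean_lum=0.5, rms_contrast=0.2):
    """Rescale an image to a common mean luminance and RMS contrast,
    so that LSF and HSF versions do not differ trivially in energy."""
    centered = img - img.mean()
    return centered * (rms_contrast / (centered.std() + 1e-12)) + mean_lum


if __name__ == "__main__":
    # Stand-in for a grayscale face photograph with values in [0, 1].
    face = np.random.rand(256, 256)
    lsf, hsf, bsf = spatial_frequency_versions(face)
    lsf, hsf, bsf = (match_luminance_and_contrast(x) for x in (lsf, hsf, bsf))
    print(lsf.shape, round(float(hsf.mean()), 3), round(float(bsf.std()), 3))
```

Filtering in the Fourier domain leaves the face geometry unchanged while removing either fine detail (LSF version) or coarse luminance structure (HSF version); the contrast-matching step addresses the luminance-contrast differences between spatial-frequency bands that the abstract mentions as a variable of interest.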

Source journal: Vision Research (Medicine - Neuroscience)
CiteScore: 3.70
Self-citation rate: 16.70%
Articles published: 111
Review time: 66 days
Journal description: Vision Research is a journal devoted to the functional aspects of human, vertebrate and invertebrate vision and publishes experimental and observational studies, reviews, and theoretical and computational analyses. Vision Research also publishes clinical studies relevant to normal visual function and basic research relevant to visual dysfunction or its clinical investigation. Functional aspects of vision are interpreted broadly, ranging from molecular and cellular function to perception and behavior. Detailed descriptions are encouraged but enough introductory background should be included for non-specialists. Theoretical and computational papers should give a sense of order to the facts or point to new verifiable observations. Papers dealing with questions in the history of vision science should stress the development of ideas in the field.
Latest articles in this journal:
Dynamics of the perceptive field size in human adults
Resting trabecular meshwork cells experience constitutive cation influx
Optical phase nullification partially restores visual and stereo acuity lost to simulated blur from higher-order wavefront aberrations of keratoconic eyes
Two different visual stimuli that cause axial eye shortening have no additive effect
Scene context and attention independently facilitate MEG decoding of object category