Facial expression categorization predominantly relies on mid-spatial frequencies

Vision Research 231 (2025): Article 108611. Published 2025-06-01; Epub 2025-04-27. DOI: 10.1016/j.visres.2025.108611. Impact Factor 1.4; JCR Q4 (Neurosciences); CAS Tier 4 (Psychology).
Isabelle Charbonneau , Justin Duncan , Caroline Blais , Joël Guérette , Marie-Pier Plouffe-Demers , Fraser Smith , Daniel Fiset
Citations: 0

Abstract

Facial expressions are crucial in human communication. Recent decades have seen growing interest in understanding the role of spatial frequencies (SFs) in the perception of emotion in others. While some studies have suggested preferential processing of low over high SFs, the optimal SFs for recognizing basic facial expressions remain elusive. This study, conducted with Western participants, addresses this gap using two complementary methods: a data-driven method (Exp. 1) without arbitrary SF cut-offs, and a more naturalistic method (Exp. 2) simulating variations in viewing distance. Results generally showed a preponderant role of low over high SFs, but particularly stress that facial expression categorization mostly relies on mid-range SF content (i.e. ∼6–13 cycles per face), often overlooked in previous studies. Optimal performance was observed at short to medium viewing distances (1.2–2.4 m), declining sharply with increased distance, precisely when mid-range SFs were no longer available. Additionally, our data suggest variations in SF tuning profiles across basic facial expressions and nuanced contributions from low and mid SFs in facial expression processing. Most importantly, our results suggest that any method that removes mid-SF content offers an incomplete account of SF diagnosticity for facial expression recognition.
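To make the notion of "mid-range SF content" concrete, the sketch below isolates a band of spatial frequencies expressed in cycles per face via the 2-D Fourier transform. This is a minimal NumPy illustration, not the authors' implementation: it uses a hard annular mask (published SF work typically uses smoother Gaussian or Butterworth filter edges to avoid ringing), it assumes the face spans the full image, and the random array merely stands in for an actual face stimulus.

```python
import numpy as np

def bandpass_cycles_per_face(img, low_cpf, high_cpf):
    """Keep only spatial frequencies between low_cpf and high_cpf,
    in cycles per face, assuming the face spans the image.

    Illustrative hard-edged filter; real studies typically smooth
    the band edges to avoid ringing artifacts.
    """
    h, w = img.shape
    # fftfreq gives cycles per sample; multiplying by the image size
    # converts to cycles per image (here, cycles per face).
    fy = np.fft.fftfreq(h) * h
    fx = np.fft.fftfreq(w) * w
    # Radial frequency of each Fourier coefficient.
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)
    mask = (radius >= low_cpf) & (radius <= high_cpf)
    spectrum = np.fft.fft2(img)
    return np.real(np.fft.ifft2(spectrum * mask))

# Example: isolate the mid-SF band (~6-13 cycles per face) reported
# as most diagnostic; a random array stands in for a face image.
face = np.random.default_rng(0).standard_normal((128, 128))
mid_band = bandpass_cycles_per_face(face, 6, 13)
```

Removing rather than keeping the band (the manipulation the abstract warns against) is the complementary mask, `~mask`, applied to the same spectrum.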
Source journal: Vision Research (Medicine – Neuroscience)
CiteScore: 3.70
Self-citation rate: 16.70%
Articles published per year: 111
Review time: 66 days

Journal description: Vision Research is a journal devoted to the functional aspects of human, vertebrate and invertebrate vision and publishes experimental and observational studies, reviews, and theoretical and computational analyses. Vision Research also publishes clinical studies relevant to normal visual function and basic research relevant to visual dysfunction or its clinical investigation. Functional aspects of vision is interpreted broadly, ranging from molecular and cellular function to perception and behavior. Detailed descriptions are encouraged but enough introductory background should be included for non-specialists. Theoretical and computational papers should give a sense of order to the facts or point to new verifiable observations. Papers dealing with questions in the history of vision science should stress the development of ideas in the field.