Decoding face identity: A reverse-correlation approach using deep learning

IF 2.8 · Region 1 (Psychology) · Q1 PSYCHOLOGY, EXPERIMENTAL · Cognition · Pub Date: 2024-11-16 · DOI: 10.1016/j.cognition.2024.106008
Xue Tian, Yiying Song, Jia Liu
Cognition, Volume 254, Article 106008 · Journal Article · Citations: 0
Full text: https://www.sciencedirect.com/science/article/pii/S0010027724002944

Abstract

Face recognition is crucial for social interactions. Traditional approaches rely primarily on subjective judgment, using a pre-selected set of facial features drawn from the literature or from intuition to identify the features critical for face recognition. In this study, we adopted a reverse-correlation approach, aligning the responses of a deep convolutional neural network (DCNN) with its internal representations to identify objectively the facial features pivotal for face recognition. Specifically, we trained a DCNN, namely VGG-FD, to possess human-like capability in discriminating facial identities. A representational similarity analysis (RSA) was employed to characterize VGG-FD's performance, which was subsequently reverse-correlated with its representations in layers capable of discriminating facial identities. Our analysis revealed a higher likelihood of face pairs being perceived as different identities when their representations differed significantly in areas such as the eyes, eyebrows, or central facial region, suggesting the significance of the eyes as facial parts and of the central facial region as an integral part of face configuration in face recognition. In summary, our study leveraged DCNNs to identify critical facial features for face discrimination in a hypothesis-neutral, data-driven manner, thereby advocating the adoption of this new paradigm to explore critical facial features across various face recognition tasks.
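The reverse-correlation logic described in the abstract can be illustrated with a minimal sketch. The paper's actual pipeline (VGG-FD, its layers, and its RSA details) is not reproduced here; instead, the toy below assumes synthetic representation-difference maps for face pairs and binary "judged different identity" labels, with a planted "eye region" driving the decisions. Correlating each location's difference magnitude with the decision across pairs yields a classification-image-style map that recovers the critical region:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the paper's quantities (illustrative only):
# each face pair gets a 16x16 map of representation differences and a
# binary label for whether the pair was judged as different identities.
n_pairs, h, w = 500, 16, 16
diff_maps = rng.normal(size=(n_pairs, h, w))

# Plant a hypothetical "eye region" whose differences drive the decision.
eye_region = np.zeros((h, w))
eye_region[4:7, 3:13] = 1.0
signal = (diff_maps * eye_region).sum(axis=(1, 2))
judged_different = (signal + rng.normal(scale=2.0, size=n_pairs)) > 0

# Reverse correlation: point-biserial correlation between each location's
# difference magnitude and the decision, computed across all face pairs.
y = judged_different.astype(float)
y_c = y - y.mean()
x_c = diff_maps - diff_maps.mean(axis=0)
corr_map = (x_c * y_c[:, None, None]).mean(axis=0) / (
    diff_maps.std(axis=0) * y.std() + 1e-12
)

# Locations inside the planted region correlate most with the decision,
# i.e. the map recovers where representational differences matter.
inside = corr_map[eye_region.astype(bool)].mean()
outside = corr_map[~eye_region.astype(bool)].mean()
print(inside > outside)
```

In the study itself, the difference maps come from the DCNN's internal representations and the decisions from its identity-discrimination responses, but the correlational step is the same in spirit: no facial feature is pre-selected, and the critical regions emerge from the data.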
Source journal: Cognition (PSYCHOLOGY, EXPERIMENTAL)
CiteScore: 6.40 · Self-citation rate: 5.90% · Articles per year: 283
About the journal: Cognition is an international journal that publishes theoretical and experimental papers on the study of the mind. It covers a wide variety of subjects concerning all the different aspects of cognition, ranging from biological and experimental studies to formal analysis. Contributions from the fields of psychology, neuroscience, linguistics, computer science, mathematics, ethology and philosophy are welcome in this journal provided that they have some bearing on the functioning of the mind. In addition, the journal serves as a forum for discussion of social and political aspects of cognitive science.
Recent articles in Cognition:
- Morality on the road: Should machine drivers be more utilitarian than human drivers?
- Relative source credibility affects the continued influence effect: Evidence of rationality in the CIE
- Decoding face identity: A reverse-correlation approach using deep learning
- How does color distribution learning affect goal-directed visuomotor behavior?
- Bias-free measure of distractor avoidance in visual search