{"title":"Multiscale Facial Expression Recognition Based on Dynamic Global and Static Local Attention","authors":"Jie Xu;Yang Li;Guanci Yang;Ling He;Kexin Luo","doi":"10.1109/TAFFC.2024.3458464","DOIUrl":null,"url":null,"abstract":"To better characterize the differences in category features in Facial Expression Recognition (FER) tasks, and improve inter-class separability and intra-class compactness, we propose a Multiscale Facial Expression Recognition model based on dynamic global and static local attention (MFER) from the perspectives of intra-class and inter-class features. Firstly, we propose Dynamic global and Static local attention (DS Attention) mechanism that fuse contextual information, learn potential regions of global and local features between different expression categories, and represent feature discrepancies between categories to distinguish between different expression categories. Then, we design a Deep Smooth Feature loss function (DSF) to balance the probability difference of encoded intra-class features and promote intra-class features towards corresponding centers. Finally, we construct a Multiscale classifier method (Msc) to learn high-frequency and low-frequency information in the dimensional space, represent deep features of multiscale dimensional space, and alleviate sparse distribution problems in high-dimensional space. 
Experimental results on public datasets RAF-DB, AffectNet-7, AffectNet-8, and FERPlus show that the proposed model achieves state-of-the-art performance with recognition accuracies of 92.08%, 67.06%, 63.15%, and 91.09%, respectively.","PeriodicalId":13131,"journal":{"name":"IEEE Transactions on Affective Computing","volume":"16 2","pages":"683-696"},"PeriodicalIF":9.8000,"publicationDate":"2024-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Affective Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10678884/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
To better characterize the differences in category features in Facial Expression Recognition (FER) tasks, and to improve inter-class separability and intra-class compactness, we propose a Multiscale Facial Expression Recognition model based on dynamic global and static local attention (MFER), designed from the perspectives of intra-class and inter-class features. First, we propose a Dynamic global and Static local attention (DS Attention) mechanism that fuses contextual information, learns potential regions of global and local features across different expression categories, and represents inter-class feature discrepancies in order to distinguish between expression categories. Then, we design a Deep Smooth Feature (DSF) loss function to balance the probability differences of encoded intra-class features and to pull intra-class features toward their corresponding class centers. Finally, we construct a Multiscale classifier (Msc) to learn high-frequency and low-frequency information in the dimensional space, represent deep features across multiscale dimensional spaces, and alleviate sparse-distribution problems in high-dimensional space. Experimental results on the public datasets RAF-DB, AffectNet-7, AffectNet-8, and FERPlus show that the proposed model achieves state-of-the-art performance, with recognition accuracies of 92.08%, 67.06%, 63.15%, and 91.09%, respectively.
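The abstract does not give the exact form of the DSF loss, but the stated goal of pulling encoded intra-class features toward their corresponding class centers can be illustrated with a generic center-loss-style penalty. The sketch below is only an illustrative stand-in, not the paper's actual DSF formulation; the function name, shapes, and the squared-distance form are all assumptions.

```python
import numpy as np

def center_style_loss(features, labels, centers):
    """Mean squared distance between each feature and its class center.

    A generic center-loss-style objective that encourages intra-class
    compactness; the paper's actual DSF loss is not specified in the
    abstract, so this is only an illustrative stand-in.

    features: (N, D) array of encoded features
    labels:   (N,) integer class labels
    centers:  (C, D) array of per-class feature centers
    """
    diffs = features - centers[labels]            # (N, D) offsets to each sample's class center
    return float(np.mean(np.sum(diffs ** 2, axis=1)))

# Tiny example: two classes with 2-D features (values are arbitrary)
feats = np.array([[1.0, 0.0], [0.9, 0.1], [-1.0, 0.0]])
labs = np.array([0, 0, 1])
cents = np.array([[1.0, 0.0], [-1.0, 0.0]])
loss = center_style_loss(feats, labs, cents)
```

Minimizing such a term alongside a classification loss tightens each class's feature cluster, which is the intra-class compactness effect the abstract attributes to DSF.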
Journal Introduction:
The IEEE Transactions on Affective Computing is an international and interdisciplinary journal. Its primary goal is to share research findings on the development of systems capable of recognizing, interpreting, and simulating human emotions and related affective phenomena. The journal publishes original research on the underlying principles and theories that explain how and why affective factors shape human-technology interactions. It also focuses on how techniques for sensing and simulating affect can enhance our understanding of human emotions and processes. Additionally, the journal explores the design, implementation, and evaluation of systems that prioritize the consideration of affect in their usability. We also welcome surveys of existing work that provide new perspectives on the historical and future directions of this field.