Jingting Li, Moi Hoon Yap, Wen-Huang Cheng, John See, Xiaopeng Hong, Xiaobai Li, Su-Jing Wang
FME '22: 2nd Workshop on Facial Micro-Expression: Advanced Techniques for Multi-Modal Facial Expression Analysis
Proceedings of the 30th ACM International Conference on Multimedia
Published: 2022-10-10
DOI: 10.1145/3503161.3554777
Citations: 1
Abstract
Micro-expressions are extremely brief, subtle facial movements that often reveal a person's genuine emotions. They are important cues for understanding real human emotions and can be used for non-contact, imperceptible deception detection or abnormal emotion recognition, with broad application prospects in national security, judicial practice, health prevention, and clinical practice. However, micro-expression feature extraction and learning are highly challenging because micro-expressions are short in duration, low in intensity, and locally asymmetric. In addition, intelligent micro-expression analysis based on deep learning is hampered by the small-sample problem: eliciting micro-expressions is difficult, and annotating them is time-consuming and laborious. More importantly, the mechanism by which micro-expressions are generated is not yet well understood, which limits their application in real-world scenarios. FME '22 is the second workshop in this area of research, aiming to promote interaction between researchers within this niche area and those from the broader fields of facial expression and psychology research. The complete FME '22 workshop proceedings are available at: https://dl.acm.org/doi/proceedings/10.1145/3552465.