{"title":"Neutral-independent geometric features for facial expression recognition","authors":"Anwar Saeed, A. Al-Hamadi, R. Niese","doi":"10.1109/ISDA.2012.6416647","DOIUrl":null,"url":null,"abstract":"Improving Human-Computer Interaction (HCI) necessitates building an efficient human emotion recognition approach that involves various modalities such as facial expressions, hand gestures, acoustic data, and biophysiological data. In this paper, we address the perception of the universal human emotions (happy, surprise, anger, disgust, fear, and sadness) from facial expressions. In our companion-based assistant system, facial expression is considered as complementary aspect to the hand gestures. Unlike many other approaches, we do not rely on prior knowledge of the neutral state to infer the emotion because annotating the neutral state usually involves human intervention. We use features extracted from just eight fiducial facial points. Our results are in a good agreement with those of a state-of-the-art approach that exploits features derived from 68 facial points and requires prior knowledge of the neutral state. Then, we evaluate our approach on two databases. Finally, we investigate the influence of the facial points detection error on our emotion recognition approach.","PeriodicalId":370150,"journal":{"name":"2012 12th International Conference on Intelligent Systems Design and Applications (ISDA)","volume":"358 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 12th International Conference on Intelligent Systems Design and Applications (ISDA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISDA.2012.6416647","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
Improving Human-Computer Interaction (HCI) necessitates building an efficient human emotion recognition approach that draws on multiple modalities such as facial expressions, hand gestures, acoustic data, and biophysiological data. In this paper, we address the recognition of the universal human emotions (happiness, surprise, anger, disgust, fear, and sadness) from facial expressions. In our companion-based assistant system, facial expression is treated as a complement to hand gestures. Unlike many other approaches, we do not rely on prior knowledge of the neutral state to infer the emotion, because annotating the neutral state usually requires human intervention. We use features extracted from just eight fiducial facial points. Our results are in good agreement with those of a state-of-the-art approach that exploits features derived from 68 facial points and requires prior knowledge of the neutral state. We then evaluate our approach on two databases and, finally, investigate the influence of facial point detection error on our emotion recognition approach.
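The abstract does not spell out how features are derived from the eight fiducial points, so the sketch below is purely illustrative, not the authors' method. It assumes the eight landmarks arrive as (x, y) image coordinates, builds pairwise distances normalized by an inter-ocular baseline (an assumed normalization, chosen because it removes face scale without requiring a neutral frame), and passes the feature vector to an off-the-shelf SVM (also an assumption; the paper's classifier may differ).

```python
# Illustrative sketch only: one plausible way to compute neutral-independent
# geometric features from eight fiducial facial points. The specific landmark
# ordering, normalization, and classifier are assumptions for this example.
import itertools

import numpy as np
from sklearn.svm import SVC


def geometric_features(points):
    """points: (8, 2) array of fiducial landmarks (e.g. eye corners,
    eyebrow tips, mouth corners) in image coordinates."""
    points = np.asarray(points, dtype=float)
    # Normalize by the inter-ocular distance so features are independent of
    # face scale; we assume points[0] and points[1] are the two eye centers.
    scale = np.linalg.norm(points[0] - points[1])
    # All C(8, 2) = 28 pairwise distances, each divided by the baseline.
    feats = [np.linalg.norm(points[i] - points[j]) / scale
             for i, j in itertools.combinations(range(len(points)), 2)]
    return np.array(feats)


# Hypothetical usage: train on labeled expression frames; no neutral-state
# frame of the same subject is needed at training or test time.
# X = np.stack([geometric_features(p) for p in landmark_sets])
# clf = SVC(kernel="rbf").fit(X, labels)
# prediction = clf.predict(geometric_features(new_points)[None, :])
```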