Facial Feature Extraction Using an Active Appearance Model on the iPhone
Yong-Hwan Lee, Woori Han, Youngseop Kim, Bonam Kim
2014 Eighth International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing
DOI: 10.1109/IMIS.2014.24 · Published 2014-07-02
Citations: 6
Abstract
Extracting and understanding human emotion plays an important role in the interaction between humans and machine communication systems. The most expressive way to display human emotion is through facial expression. In this paper, we propose a novel extraction and recognition method for facial expression and emotion on mobile cameras and formulate a classification model for facial emotions using the variance of the estimated landmark points. Sixty-five feature points are identified to extract the features from the input face, and the variance values of the point locations are then used to recognize facial emotions with a weighted fuzzy k-NN classifier. Three types of facial emotion are recognized and classified: neutral, happy, and angry. To evaluate the performance of the proposed algorithm, we assess the success rate using iPhone camera views. The experimental results show that the proposed method performs well in recognizing facial emotion and is sufficient to warrant its immediate application in mobile environments.
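The abstract does not spell out the classifier's implementation. As a rough illustration only, a weighted fuzzy k-NN in the classical Keller-style formulation (class memberships of the k nearest neighbors averaged with inverse-distance weights) over landmark-variance feature vectors might be sketched as follows; the function name, the fuzzifier `m`, and the toy feature layout are all illustrative assumptions, not the paper's actual method:

```python
import numpy as np

def fuzzy_knn_predict(X_train, memberships, x, k=5, m=2.0):
    """Weighted fuzzy k-NN sketch (assumed formulation, not the paper's code).

    X_train:     (n_samples, n_features) training feature vectors,
                 e.g. per-face variances of landmark-point locations.
    memberships: (n_samples, n_classes) fuzzy class memberships of the
                 training samples (one-hot for crisp labels).
    x:           query feature vector.
    Returns a membership vector over classes; argmax gives the label
    (e.g. 0 = neutral, 1 = happy, 2 = angry).
    """
    # Euclidean distance from the query to every training sample.
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]                       # k nearest neighbors
    # Inverse-distance weights with fuzzifier m; epsilon avoids divide-by-zero.
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + 1e-8)
    # Predicted membership = weighted average of the neighbors' memberships.
    u = (w[:, None] * memberships[idx]).sum(axis=0) / w.sum()
    return u
```

With crisp (one-hot) training labels the output still sums to 1, so it can be read as a soft confidence over the three emotion classes rather than a hard vote, which is one practical reason to prefer the fuzzy variant over plain k-NN.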