Facial Expression Recognition Using DCNN and Development of an iOS App for Children with ASD to Enhance Communication Abilities
Md Inzamam Ul Haque, Damian Valles
2019 IEEE 10th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), October 2019
DOI: 10.1109/UEMCON47517.2019.8993051
In this paper, the continued work of a research project is discussed, which achieved the project's end goal: to build a mobile device application that can teach children with Autism Spectrum Disorder (ASD) to recognize human facial expressions using computer vision and image processing. There are seven universally recognized facial expression categories: angry, disgust, happy, sad, fear, surprise, and neutral. Recognizing all of these facial expressions and predicting a person's current mood is a difficult task for a child. For a child with ASD, this problem presents itself in a more complex manner due to the nature of the disorder. The main goal of this research was to develop a deep Convolutional Neural Network (DCNN) for facial expression recognition that can help young children with ASD recognize facial expressions using mobile devices. Kaggle's FER2013 and the Karolinska Directed Emotional Faces (KDEF) datasets were used to train and test the DCNN model, which can classify facial expressions from different viewpoints and under different lighting contrasts. The DCNN model achieved an accuracy of 86.44% with good generalizability. The results show that applying image processing before performing the facial expression classification improves the DCNN's accuracy in handling changes in lighting contrast. As a byproduct of this research project, an iOS app was developed that runs both the DCNN model and the image processing algorithm. The app can be used by speech-language pathologists, teachers, caretakers, and parents as a technological tool when working with children with ASD.
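The abstract credits an image processing step, applied before classification, with improving the DCNN's robustness to lighting-contrast changes. The exact preprocessing pipeline is not specified here; as an illustrative assumption, histogram equalization is one common way to normalize lighting in grayscale face crops such as the 48×48 FER2013 images. A minimal NumPy sketch of that idea (the function name and image size are assumptions, not the authors' implementation):

```python
import numpy as np

def equalize_histogram(img: np.ndarray) -> np.ndarray:
    """Spread an 8-bit grayscale image's intensities across the full
    0-255 range via histogram equalization, reducing the effect of
    lighting-contrast variation between face images."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]          # first nonzero CDF value
    if cdf[-1] == cdf_min:             # constant image: nothing to equalize
        return img.copy()
    # Build a lookup table mapping each intensity through the normalized CDF.
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
    return lut[img]

# A dim, low-contrast 48x48 face-sized test image (intensities 100-119)
# gets stretched to the full dynamic range.
img = np.linspace(100, 119, 48 * 48).astype(np.uint8).reshape(48, 48)
eq = equalize_histogram(img)  # eq now spans 0..255
```

In a pipeline like the one described, such a step would run on each detected face crop before it is fed to the DCNN, so that training and inference see comparably lit inputs.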