{"title":"Facial expressions recognition for arabic sign language translation","authors":"A. S. Elons, Menna Ahmed, Hwaidaa Shedid","doi":"10.1109/ICCES.2014.7030980","DOIUrl":null,"url":null,"abstract":"Contrary to the common sense that tells us sign language depends mainly on hands, other factors such as facial expressions, body movements and lips affect dramatically a sign meaning. Arabic Sign Language (ArSL) tends to be a descriptive gesture language, facial expressions are involved in 70% of total signs. In this paper, a study on an ArSL database is performed to conclude that the 6 main facial expressions are essential to recognize the sign. A developed system used to classify these expressions accomplished 92% recognition rate on 5 different people. The system employed already existing technical methods such as: Recursive Principle Components (RPCA) for feature extraction and Multi-layer Perceptron (MLP) for classification. The main contribution of this paper is employing the developed module and integrating it with an already existing hand sign recognition system. The proposed system enhanced the hand sign recognition system and raised the recognition rate from 88% to 98%. 
Various people's shapes and capturing angles and distances have also been taken into consideration.","PeriodicalId":339697,"journal":{"name":"2014 9th International Conference on Computer Engineering & Systems (ICCES)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 9th International Conference on Computer Engineering & Systems (ICCES)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCES.2014.7030980","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 7
Abstract
Contrary to the common-sense view that sign language depends mainly on the hands, other factors such as facial expressions, body movements, and lip patterns dramatically affect a sign's meaning. Arabic Sign Language (ArSL) tends to be a descriptive gesture language; facial expressions are involved in 70% of its signs. In this paper, a study on an ArSL database is performed, concluding that the six main facial expressions are essential for recognizing a sign. A system developed to classify these expressions achieved a 92% recognition rate across 5 different people. The system employs existing techniques: Recursive Principal Component Analysis (RPCA) for feature extraction and a Multi-Layer Perceptron (MLP) for classification. The main contribution of this paper is integrating the developed facial-expression module with an existing hand-sign recognition system. The combined system raised the recognition rate from 88% to 98%. Variations in people's body shapes and in capture angles and distances have also been taken into consideration.
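The abstract's pipeline, reducing each face image to a compact feature vector before classification, can be illustrated with a minimal sketch. The paper's specific Recursive PCA variant and trained MLP are not reproduced here; as a hedged stand-in, the example below uses plain SVD-based PCA on synthetic data to show the feature-extraction stage that would feed a classifier:

```python
import numpy as np

def pca_features(images, k):
    """Project flattened images onto their top-k principal components.

    A stand-in for the paper's RPCA stage: any PCA variant reduces each
    image to a short feature vector that a downstream classifier (the
    paper uses an MLP) can consume.
    """
    X = np.asarray(images, dtype=float)   # shape (n_samples, n_pixels)
    mean = X.mean(axis=0)
    Xc = X - mean                         # center the data
    # SVD of the centered data; rows of Vt are the principal directions
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]                   # top-k directions
    return Xc @ components.T              # (n_samples, k) feature matrix

# Toy usage: 10 synthetic 8x8 "face" images reduced to 3 features each
rng = np.random.default_rng(0)
faces = rng.random((10, 64))
feats = pca_features(faces, 3)
print(feats.shape)  # (10, 3)
```

In a full system along the lines the paper describes, these low-dimensional features would be the inputs to an MLP trained to output one of the six facial-expression classes.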