Bi-Model Engagement Emotion Recognition Based on Facial and Upper-Body Landmarks and Machine Learning Approaches
Haifa F. Alhasson, Ghada M. Alsaheel, Noura S. Alharbi, Alhatoon A. Alsalamah, Joud M. Alhujilan, Shuaa S. Alharbi
International Journal of E-Services and Mobile Applications, published 2023-09-27
DOI: 10.4018/ijesma.330756
Citations: 0
Abstract
Customer satisfaction can be measured using facial expression recognition. The current generation of artificial intelligence systems depends heavily on facial features such as the eyebrows, eyes, and forehead. This dependence is a limitation, as people generally prefer to conceal their genuine emotions. Because body gestures are difficult to conceal and can convey a more detailed and accurate emotional state, the authors incorporate upper-body gestures as an additional feature that improves the accuracy of the predicted emotion. This work uses an ensemble machine-learning model that integrates support vector machines, random forest classifiers, and logistic regression classifiers. The proposed method detects emotions from facial expressions and upper-body movements, and experimental evaluation shows it to be effective, achieving 97% accuracy on the EMOTIC dataset and 99% on the MELD dataset.
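The paper does not publish its code, but the described ensemble of a support vector machine, a random forest, and a logistic regression can be sketched with scikit-learn's `VotingClassifier`. This is a minimal illustration only: the landmark counts, the synthetic features, and the three-class label set are assumptions, not the authors' actual pipeline or datasets.

```python
# Hedged sketch of an SVM + random forest + logistic regression ensemble
# over landmark-style features. Data is synthetic; landmark counts and
# emotion classes are hypothetical placeholders, not the paper's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_samples = 600
n_face, n_body = 136, 36  # e.g. 68 facial + 18 upper-body (x, y) landmarks (assumed)
X = rng.normal(size=(n_samples, n_face + n_body))
y = rng.integers(0, 3, size=n_samples)  # 3 placeholder emotion classes
X[y == 1] += 0.8  # inject class-dependent signal so the sketch has
X[y == 2] -= 0.8  # something learnable

# Soft voting averages the class probabilities of the three base models.
ensemble = VotingClassifier(
    estimators=[
        ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
    ],
    voting="soft",
)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
ensemble.fit(X_tr, y_tr)
print(f"held-out accuracy: {ensemble.score(X_te, y_te):.2f}")
```

In practice, the face and upper-body landmarks would be extracted by a pose/landmark detector and concatenated into one feature vector before training; soft voting lets a confident base model outweigh the others, which is one common way such ensembles are combined.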
About the journal
The International Journal of E-Services and Mobile Applications (IJESMA) promotes and publishes state-of-the-art research on issues in the production, management, delivery, and consumption of e-services, self-services, and mobile communication. Coverage includes business-to-business, business-to-consumer, government-to-business, government-to-consumer, and consumer-to-consumer e-services of interest to professionals, academic educators, researchers, and industry consultants in the field.