{"title":"Towards Formal Multimodal Analysis of Emotions for Affective Computing","authors":"M. Ghayoumi, Maha Thafar, A. Bansal","doi":"10.18293/DMS2016-030","DOIUrl":null,"url":null,"abstract":"Social robotics is related to the robotic systems and human interaction. Social robots have applications in elderly care, health care, home care, customer service and reception in industrial settings. Human-Robot Interaction (HRI) requires better understanding of human emotion. There are few multimodal fusion systems that integrate limited amount of facial expression, speech and gesture analysis. In this paper, we describe the implementation of a semantic algebra based formal model that integrates six basic facial expressions, speech phrases and gesture trajectories. The system is capable of real-time interaction. We used the decision level fusion approach for integration and the prototype system has been implemented using Matlab. KeywordsAffective computing, Emotion recognition, Humanmachine interaction, Multimedia, Multimodal, Decision level fusion, Social robotics.","PeriodicalId":297195,"journal":{"name":"J. Vis. Lang. Sentient Syst.","volume":"65 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-11-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"11","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"J. Vis. Lang. Sentient Syst.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18293/DMS2016-030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 11
Abstract
Social robotics concerns robotic systems and their interaction with humans. Social robots have applications in elderly care, health care, home care, customer service, and reception in industrial settings. Human-Robot Interaction (HRI) requires a better understanding of human emotion. Few multimodal fusion systems exist, and they integrate only a limited set of facial expression, speech, and gesture analyses. In this paper, we describe the implementation of a semantic-algebra-based formal model that integrates six basic facial expressions, speech phrases, and gesture trajectories. The system is capable of real-time interaction. We use a decision-level fusion approach for integration, and the prototype system has been implemented in Matlab.

Keywords: Affective computing, Emotion recognition, Human-machine interaction, Multimedia, Multimodal, Decision level fusion, Social robotics.
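The abstract names decision-level fusion as the integration strategy but does not spell out the combination rule, and the authors' prototype was implemented in Matlab. The Python sketch below is only an illustration of the general decision-level fusion idea over the six basic emotion classes, assuming each modality (face, speech, gesture) produces its own class-probability vector and that the scores are combined by a weighted vote; the function name, the weights, and the per-modality label handling are assumptions, not the authors' implementation.

```python
import numpy as np

# Six basic emotion classes referenced by the abstract (Ekman's set).
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def decision_level_fusion(face_probs, speech_probs, gesture_probs,
                          weights=(0.5, 0.3, 0.2)):
    """Combine per-modality emotion scores with a weighted sum (a simple
    decision-level fusion rule) and return the winning emotion label.

    Each *_probs argument is a length-6 vector of class probabilities
    produced by an independent, modality-specific classifier.
    The weights here are hypothetical and would normally be tuned.
    """
    face = np.asarray(face_probs, dtype=float)
    speech = np.asarray(speech_probs, dtype=float)
    gesture = np.asarray(gesture_probs, dtype=float)

    fused = (weights[0] * face +
             weights[1] * speech +
             weights[2] * gesture)
    return EMOTIONS[int(np.argmax(fused))], fused

# Example: each modality contributes its own probability distribution.
face = [0.05, 0.05, 0.05, 0.70, 0.10, 0.05]    # facial-expression classifier
speech = [0.10, 0.05, 0.05, 0.50, 0.20, 0.10]  # speech-phrase classifier
gesture = [0.10, 0.10, 0.10, 0.40, 0.20, 0.10] # gesture-trajectory classifier

label, scores = decision_level_fusion(face, speech, gesture)
print(label)  # -> "happiness"
```

A weighted sum over classifier outputs is one of the simplest decision-level rules; the paper's semantic-algebra formal model may define the combination differently.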