Nourhan Elfaramawy, Pablo V. A. Barros, G. I. Parisi, S. Wermter
{"title":"基于神经网络结构的身体表情情感识别","authors":"Nourhan Elfaramawy, Pablo V. A. Barros, G. I. Parisi, S. Wermter","doi":"10.1145/3125739.3125772","DOIUrl":null,"url":null,"abstract":"The recognition of emotions plays an important role in our daily life and is essential for social communication. Although multiple studies have shown that body expressions can strongly convey emotional states, emotion recognition from body motion patterns has received less attention than the use of facial expressions. In this paper, we propose a self-organizing neural architecture that can effectively recognize affective states from full-body motion patterns. To evaluate our system, we designed and collected a data corpus named the Body Expressions of Emotion (BEE) dataset using a depth sensor in a human-robot interaction scenario. For our recordings, nineteen participants were asked to perform six different emotions:anger, fear, happiness, neutral, sadness, and surprise. In order to compare our system with human-like performance, we conducted an additional experiment by asking fifteen annotators to label depth map video sequences as one of the six emotion classes. The labeling results from human annotators were compared to the results predicted by our system. Experimental results showed that the recognition accuracy of the system was competitive with human performance when exposed to body motion patterns from the same dataset.","PeriodicalId":346669,"journal":{"name":"Proceedings of the 5th International Conference on Human Agent Interaction","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"22","resultStr":"{\"title\":\"Emotion Recognition from Body Expressions with a Neural Network Architecture\",\"authors\":\"Nourhan Elfaramawy, Pablo V. A. Barros, G. I. Parisi, S. Wermter\",\"doi\":\"10.1145/3125739.3125772\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The recognition of emotions plays an important role in our daily life and is essential for social communication. Although multiple studies have shown that body expressions can strongly convey emotional states, emotion recognition from body motion patterns has received less attention than the use of facial expressions. In this paper, we propose a self-organizing neural architecture that can effectively recognize affective states from full-body motion patterns. To evaluate our system, we designed and collected a data corpus named the Body Expressions of Emotion (BEE) dataset using a depth sensor in a human-robot interaction scenario. For our recordings, nineteen participants were asked to perform six different emotions:anger, fear, happiness, neutral, sadness, and surprise. In order to compare our system with human-like performance, we conducted an additional experiment by asking fifteen annotators to label depth map video sequences as one of the six emotion classes. The labeling results from human annotators were compared to the results predicted by our system. 
Experimental results showed that the recognition accuracy of the system was competitive with human performance when exposed to body motion patterns from the same dataset.\",\"PeriodicalId\":346669,\"journal\":{\"name\":\"Proceedings of the 5th International Conference on Human Agent Interaction\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-10-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"22\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 5th International Conference on Human Agent Interaction\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3125739.3125772\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 5th International Conference on Human Agent Interaction","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3125739.3125772","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Emotion Recognition from Body Expressions with a Neural Network Architecture
The recognition of emotions plays an important role in our daily life and is essential for social communication. Although multiple studies have shown that body expressions can strongly convey emotional states, emotion recognition from body motion patterns has received less attention than the use of facial expressions. In this paper, we propose a self-organizing neural architecture that can effectively recognize affective states from full-body motion patterns. To evaluate our system, we designed and collected a data corpus named the Body Expressions of Emotion (BEE) dataset using a depth sensor in a human-robot interaction scenario. For our recordings, nineteen participants were asked to perform six different emotions: anger, fear, happiness, neutral, sadness, and surprise. To compare our system's performance with that of humans, we conducted an additional experiment in which fifteen annotators labeled depth map video sequences with one of the six emotion classes. The labels provided by the human annotators were compared to the predictions of our system. Experimental results showed that the recognition accuracy of the system was competitive with human performance when exposed to body motion patterns from the same dataset.
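The abstract does not detail the proposed architecture. As a purely illustrative sketch of the general idea behind self-organizing classification (not the authors' implementation), the following Python example trains a small self-organizing map on synthetic body-motion feature vectors, labels each neuron by majority vote, and classifies new samples via their best-matching unit. All names, feature dimensions, and data here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

EMOTIONS = ["anger", "fear", "happiness", "neutral", "sadness", "surprise"]
N_FEATURES = 30           # assumed size of a motion-feature vector
GRID = 6                  # 6 x 6 map of neurons
N_NEURONS = GRID * GRID

# Synthetic stand-in for features extracted from depth-sensor skeletons
# (e.g. per-joint velocity statistics); the real BEE data is not reproduced here.
X_train = rng.normal(size=(600, N_FEATURES))
y_train = rng.integers(0, len(EMOTIONS), size=600)

# Neuron positions on a 2-D grid, used by the neighbourhood function.
grid_pos = np.array([(i, j) for i in range(GRID) for j in range(GRID)], dtype=float)
weights = rng.normal(size=(N_NEURONS, N_FEATURES))

EPOCHS = 20
for epoch in range(EPOCHS):
    lr = 0.5 * (1.0 - epoch / EPOCHS)            # decaying learning rate
    sigma = 2.0 * (1.0 - epoch / EPOCHS) + 0.5   # shrinking neighbourhood radius
    for x in X_train:
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))   # best-matching unit
        d2 = np.sum((grid_pos - grid_pos[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma ** 2))     # Gaussian neighbourhood weighting
        weights += lr * h[:, None] * (x - weights)

# Label each neuron with the majority emotion among the samples it wins.
votes = np.zeros((N_NEURONS, len(EMOTIONS)))
for x, y in zip(X_train, y_train):
    votes[np.argmin(np.linalg.norm(weights - x, axis=1)), y] += 1
neuron_label = votes.argmax(axis=1)

def predict(sample):
    """Classify one motion-feature vector via its best-matching unit's label."""
    bmu = np.argmin(np.linalg.norm(weights - sample, axis=1))
    return EMOTIONS[neuron_label[bmu]]

print(predict(rng.normal(size=N_FEATURES)))
```

In practice, a system of this kind would replace the synthetic vectors with features computed from tracked skeleton joints over time; the sketch only illustrates how a self-organizing layer can be turned into a classifier for the six emotion classes named in the abstract.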