Authors: Mark Rosenblum, Y. Yacoob, Larry Davis
DOI: 10.1109/MNRAO.1994.346256
Venue: Proceedings of 1994 IEEE Workshop on Motion of Non-rigid and Articulated Objects
Published: 1994-11-11
Citations: 117
Human emotion recognition from motion using a radial basis function network architecture
A radial basis function network architecture is developed that learns the correlation between facial feature motion patterns and human emotions. We describe a hierarchical approach which, at the highest level, identifies emotions; at the mid level, determines the motion of facial features; and at the low level, recovers motion directions. Individual emotion networks were trained to recognize the 'smile' and 'surprise' emotions. Each emotion network was trained by viewing a set of sequences of one emotion for many subjects. The trained neural network was then tested for retention, extrapolation, and rejection ability. Success rates were approximately 88% for retention, 73% for extrapolation, and 79% for rejection.
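For readers unfamiliar with the building block the paper uses, the following is a minimal, illustrative sketch of a Gaussian radial basis function network: inputs are projected onto Gaussian activations around a set of centers, and a linear output layer is fit on top. The centers, width, and least-squares fit here are generic textbook choices, not the authors' actual training procedure or feature representation.

```python
import numpy as np

def rbf_activations(X, centers, width):
    """Gaussian activations of each input row against each center."""
    # Squared Euclidean distance between every input and every center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_rbf(X, y, centers, width):
    """Fit linear output weights by least squares on the RBF activations."""
    H = rbf_activations(X, centers, width)
    W, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W

def predict_rbf(X, centers, width, W):
    """Network output: weighted sum of Gaussian basis responses."""
    return rbf_activations(X, centers, width) @ W

# Toy usage on synthetic data (two well-separated clusters, one label each);
# the paper instead feeds facial-feature motion patterns as inputs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal(1, 0.1, (20, 2))])
y = np.array([0.0] * 20 + [1.0] * 20)
centers = np.array([[0.0, 0.0], [1.0, 1.0]])
W = fit_rbf(X, y, centers, width=0.5)
pred = predict_rbf(X, centers, 0.5, W)
```

In the paper's hierarchical scheme, one such network per emotion scores its target emotion, which is what makes the retention/extrapolation/rejection evaluation natural: a trained network should fire on held-out sequences of its emotion and stay quiet on others.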