Huawei Zhang, Tsuyoshi Takahashi, Y. Kageyama, M. Nishida
Emotion Discrimination of Amusement Based on Three-Dimensional Data of Body Movements
International Journal of the Society of Materials Engineering for Resources. DOI: 10.5188/ijsmer.23.189. Published 2018-09-30.
Abstract
An important problem associated with the aging of society is the improvement of quality of life (QOL). If we can maintain or improve QOL for individuals, including the elderly, this will contribute not only to the revitalization and stability of society [1] but also to the reduction of social costs such as nursing care and medical expenses [2]. A living-support system, such as an acceptable living environment combined with health monitoring, is therefore required. No imaging system currently exists that can automatically and quantitatively evaluate, in real time, the factors that improve individual QOL and their intensity. Human QOL and "laughs/smiles" are closely related: individuals who laugh every day exhibit better health and an improved psychological state. In addition, calmness is recognized in animal-assisted therapy and is known to contribute to the improvement of QOL [3]. Based on these findings, maintaining or improving QOL appears possible in an environment where amusement and calm are encouraged. With visual images, it is difficult to distinguish natural facial expressions from those shown intentionally, and there is no established example of defining and observing calmness [4]. Miyasaka et al. reported on emotions by focusing on changes in facial skin temperature and the accompanying blood flow [5], with emphasis on nose temperature, while Kumamoto et al. reported a method for evaluating stress [6]. However, no existing system integrates visible and infrared images to evaluate emotional changes. In our previous studies, we acquired motion data on lip movement and showed that movement features of the lips change with the subject's feelings (e.g., states of no stress or no vision) and physical condition. We clarified that the presence or absence of psychological change can be discriminated from lip-movement features when amusement is evoked [7].
Vertical motion was observed in the head and shoulders when a strong emotion such as "amusement" was expressed. Nonverbal communication, including body movements, contributes to the transmission of approximately 65% of information [8]. Combining facial expressions, lip-movement features, and body-movement features is therefore useful for the recognition and quantification of emotions. In this paper, we develop elementary technologies for multi-image processing systems that can recognize multiple emotions, focusing on a basic study of body-movement features for emotion recognition. This study aims to verify whether emotions of amusement can be detected from body-movement features using Microsoft Kinect. Although there are individual differences, this study also aims to demonstrate the possibility of using body-movement features as indicators of the strength of emotional expression. Specifically, we use the Xbox One Kinect sensor (Microsoft Corp.; "Kinect" for short) to capture nonverbal information, such as body movements, in a noncontact fashion while subjects watch an emotion-eliciting video [9]. We acquired three-dimensional data on head and shoulder movements during viewing, and we conducted studies on the quantification of amusement and calmness. In this paper, we examine the correlation between emotions of "amusement" and body movements through multiple methods, using data acquired from multiple subjects.
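The abstract does not specify how vertical head and shoulder movement is summarized into features. As a minimal illustration only, assuming a time series of y coordinates for one tracked joint (such as the head joint from a Kinect skeleton stream), simple amplitude measures like the range and standard deviation of the vertical trace could serve as body-movement features of the kind described; the function and sample values below are hypothetical, not taken from the paper.

```python
import statistics

def vertical_motion_features(y_positions):
    """Summarize vertical (y-axis) movement of a tracked joint.

    y_positions: y coordinates (e.g., in metres) of one skeleton joint
    sampled over time. Returns (range, population standard deviation),
    two simple amplitude measures of up-and-down body movement.
    """
    span = max(y_positions) - min(y_positions)  # peak-to-peak amplitude
    sd = statistics.pstdev(y_positions)          # spread of the trace
    return span, sd

# Hypothetical traces: a subject laughing (larger vertical oscillation
# of the head) versus the same subject sitting calmly.
laughing = [0.80, 0.77, 0.83, 0.76, 0.82, 0.78]
calm = [0.80, 0.80, 0.79, 0.80, 0.80, 0.80]

# Larger amplitude features for the laughing trace.
assert vertical_motion_features(laughing)[0] > vertical_motion_features(calm)[0]
```

In this sketch, a stronger expression of amusement would show up as larger range and standard deviation values, which matches the paper's idea of using body-movement features as indicators of emotional-expression strength.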