{"title":"从表达末端执行器轨迹到表达身体动作","authors":"Pamela Carreno-Medrano, S. Gibet, P. Marteau","doi":"10.1145/2915926.2915941","DOIUrl":null,"url":null,"abstract":"Recent results in the affective computing sciences point towards the importance of virtual characters capable of conveying affect through their movements. However, in spite of all advances made on the synthesis of expressive motions, almost all of the existing approaches focus on the translation of stylistic content rather than on the generation of new expressive motions. Based on studies that show the importance of end-effector trajectories in the perception and recognition of affect, this paper proposes a new approach for the automatic generation of affective motions. In this approach, expressive content is embedded in a low-dimensional manifold built from the observation of end-effector trajectories. These trajectories are taken from an expressive motion capture database. Body motions are then reconstructed by a multi-chain Inverse Kinematics controller. The similarity between the expressive content of MoCap and synthesized motions is quantitatively assessed through information theory measures.","PeriodicalId":409915,"journal":{"name":"Proceedings of the 29th International Conference on Computer Animation and Social Agents","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"From Expressive End-Effector Trajectories to Expressive Bodily Motions\",\"authors\":\"Pamela Carreno-Medrano, S. Gibet, P. Marteau\",\"doi\":\"10.1145/2915926.2915941\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Recent results in the affective computing sciences point towards the importance of virtual characters capable of conveying affect through their movements. However, in spite of all advances made on the synthesis of expressive motions, almost all of the existing approaches focus on the translation of stylistic content rather than on the generation of new expressive motions. Based on studies that show the importance of end-effector trajectories in the perception and recognition of affect, this paper proposes a new approach for the automatic generation of affective motions. In this approach, expressive content is embedded in a low-dimensional manifold built from the observation of end-effector trajectories. These trajectories are taken from an expressive motion capture database. Body motions are then reconstructed by a multi-chain Inverse Kinematics controller. 
The similarity between the expressive content of MoCap and synthesized motions is quantitatively assessed through information theory measures.\",\"PeriodicalId\":409915,\"journal\":{\"name\":\"Proceedings of the 29th International Conference on Computer Animation and Social Agents\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-05-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 29th International Conference on Computer Animation and Social Agents\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2915926.2915941\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 29th International Conference on Computer Animation and Social Agents","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2915926.2915941","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Recent results in affective computing point towards the importance of virtual characters capable of conveying affect through their movements. However, despite the advances made in the synthesis of expressive motions, almost all existing approaches focus on the translation of stylistic content rather than on the generation of new expressive motions. Based on studies that show the importance of end-effector trajectories in the perception and recognition of affect, this paper proposes a new approach for the automatic generation of affective motions. In this approach, expressive content is embedded in a low-dimensional manifold built from the observation of end-effector trajectories. These trajectories are taken from an expressive motion capture database. Body motions are then reconstructed by a multi-chain inverse kinematics controller. The similarity between the expressive content of MoCap and synthesized motions is quantitatively assessed through information theory measures.
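To make the pipeline described in the abstract more concrete, below is a minimal, hypothetical Python sketch of two of its stages: embedding end-effector trajectories in a low-dimensional space and comparing captured versus synthesized motions with an information-theoretic measure. The paper does not specify these exact tools; the PCA embedding, the Jensen-Shannon divergence, and all function names, array shapes, and parameters here are illustrative assumptions, not the authors' method (in particular, the multi-chain IK reconstruction step is omitted).

```python
# Illustrative sketch only: PCA embedding of end-effector trajectories and an
# information-theoretic comparison of captured vs. synthesized motions.
# The specific manifold construction and divergence used in the paper may differ.
import numpy as np
from scipy.spatial.distance import jensenshannon

def embed_trajectories(trajectories, n_components=3):
    """PCA embedding of flattened end-effector trajectories.

    trajectories: array of shape (n_motions, n_frames * n_effectors * 3),
    each row being one motion's concatenated 3D end-effector positions.
    Returns the low-dimensional coordinates and the principal directions.
    """
    X = trajectories - trajectories.mean(axis=0)
    # SVD-based PCA; rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    components = Vt[:n_components]
    return X @ components.T, components

def mean_js_divergence(coords_a, coords_b, bins=20):
    """Average Jensen-Shannon divergence between the per-dimension
    distributions of two sets of embedded motions (histogram estimate)."""
    divs = []
    for d in range(coords_a.shape[1]):
        lo = min(coords_a[:, d].min(), coords_b[:, d].min())
        hi = max(coords_a[:, d].max(), coords_b[:, d].max())
        p, _ = np.histogram(coords_a[:, d], bins=bins, range=(lo, hi), density=True)
        q, _ = np.histogram(coords_b[:, d], bins=bins, range=(lo, hi), density=True)
        # scipy's jensenshannon returns the square root of the divergence.
        divs.append(jensenshannon(p + 1e-12, q + 1e-12) ** 2)
    return float(np.mean(divs))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mocap = rng.normal(size=(50, 300))                        # 50 captured motions (toy data)
    synth = mocap + rng.normal(scale=0.1, size=mocap.shape)   # 50 "synthesized" motions
    latent_mocap, basis = embed_trajectories(mocap)
    latent_synth = (synth - mocap.mean(axis=0)) @ basis.T
    print("mean JS divergence:", mean_js_divergence(latent_mocap, latent_synth))
```

In this toy setup, a low divergence indicates that the synthesized motions occupy roughly the same region of the low-dimensional expressive space as the captured ones, which is the kind of quantitative similarity assessment the abstract alludes to.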