{"title":"数据驱动的声音设计增强机器人动作表现力的感知","authors":"Luke Dahl, Jon Bellona, Lin Bai, A. LaViers","doi":"10.1145/3077981.3078047","DOIUrl":null,"url":null,"abstract":"Since people communicate intentions and inner states through movement, robots can better interact with humans if they too can modify their movements to communicate changing state. These movements, which may be seen as supplementary to those required for workspace tasks, may be termed \"expressive.\" However, robot hardware, which cannot recreate the same range of dynamics as human limbs, often limit expressive capacity. One solution is to augment expressive robotic movement with expressive sound. To that end, this paper presents a study to find a qualitative mapping between movement and sound. Musicians were asked to vocalize sounds in response to animations of a simple simulated upper body movement performed with different movement qualities, parametrized according to Laban's Effort System. Qualitative labelling and quantitative signal analysis of these sounds suggests a number of correspondences between movement qualities and sound qualities. These correspondences are presented and analyzed here to set up future work that will test user perceptions when expressive movements and sounds are used in conjunction.","PeriodicalId":206209,"journal":{"name":"Proceedings of the 4th International Conference on Movement Computing","volume":"92 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Data-Driven Design of Sound for Enhancing the Perception of Expressive Robotic Movement\",\"authors\":\"Luke Dahl, Jon Bellona, Lin Bai, A. LaViers\",\"doi\":\"10.1145/3077981.3078047\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Since people communicate intentions and inner states through movement, robots can better interact with humans if they too can modify their movements to communicate changing state. These movements, which may be seen as supplementary to those required for workspace tasks, may be termed \\\"expressive.\\\" However, robot hardware, which cannot recreate the same range of dynamics as human limbs, often limit expressive capacity. One solution is to augment expressive robotic movement with expressive sound. To that end, this paper presents a study to find a qualitative mapping between movement and sound. Musicians were asked to vocalize sounds in response to animations of a simple simulated upper body movement performed with different movement qualities, parametrized according to Laban's Effort System. Qualitative labelling and quantitative signal analysis of these sounds suggests a number of correspondences between movement qualities and sound qualities. 
These correspondences are presented and analyzed here to set up future work that will test user perceptions when expressive movements and sounds are used in conjunction.\",\"PeriodicalId\":206209,\"journal\":{\"name\":\"Proceedings of the 4th International Conference on Movement Computing\",\"volume\":\"92 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-06-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 4th International Conference on Movement Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3077981.3078047\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 4th International Conference on Movement Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3077981.3078047","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Data-Driven Design of Sound for Enhancing the Perception of Expressive Robotic Movement
Since people communicate intentions and inner states through movement, robots can better interact with humans if they too can modify their movements to communicate changing states. These movements, which may be seen as supplementary to those required for workspace tasks, may be termed "expressive." However, robot hardware, which cannot recreate the same range of dynamics as human limbs, often limits expressive capacity. One solution is to augment expressive robotic movement with expressive sound. To that end, this paper presents a study that seeks a qualitative mapping between movement and sound. Musicians were asked to vocalize sounds in response to animations of a simple simulated upper-body movement performed with different movement qualities, parametrized according to Laban's Effort System. Qualitative labelling and quantitative signal analysis of these sounds suggest a number of correspondences between movement qualities and sound qualities. These correspondences are presented and analyzed here to set up future work that will test user perceptions when expressive movements and sounds are used in conjunction.
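As a concrete illustration of the kind of quantitative signal analysis mentioned above, the following is a minimal sketch of how coarse sound-quality descriptors (duration, loudness envelope, spectral brightness) might be extracted from a recorded vocalization in Python with librosa. The feature choices and the analyze_vocalization function are illustrative assumptions, not the paper's actual analysis pipeline.

# Hypothetical sketch: extract coarse acoustic descriptors from one vocalization.
# The feature set (RMS energy, spectral centroid, duration) is an assumption,
# not the features reported in the paper.
import numpy as np
import librosa

def analyze_vocalization(path):
    """Return simple sound-quality descriptors for a single recording."""
    y, sr = librosa.load(path, sr=None, mono=True)

    # Loudness over time: frame-wise RMS energy.
    rms = librosa.feature.rms(y=y)[0]

    # Brightness over time: frame-wise spectral centroid in Hz.
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]

    return {
        "duration_s": len(y) / sr,
        "rms_mean": float(np.mean(rms)),
        "rms_peak": float(np.max(rms)),
        "centroid_mean_hz": float(np.mean(centroid)),
        "centroid_std_hz": float(np.std(centroid)),
    }

# Example usage (file name is hypothetical):
# print(analyze_vocalization("vocalization_sudden_strong.wav"))

Descriptors like these could then be compared across vocalizations elicited by different Laban Effort settings to look for the movement-sound correspondences the abstract describes.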