Xuedan Gao, Amit Rogel, Raghavasimhan Sankaranarayanan, Brody Dowling, Gil Weinberg
{"title":"音乐、身体和机器:基于手势的人机音乐互动同步。","authors":"Xuedan Gao, Amit Rogel, Raghavasimhan Sankaranarayanan, Brody Dowling, Gil Weinberg","doi":"10.3389/frobt.2024.1461615","DOIUrl":null,"url":null,"abstract":"<p><p>Musical performance relies on nonverbal cues for conveying information among musicians. Human musicians use bodily gestures to communicate their interpretation and intentions to their collaborators, from mood and expression to anticipatory cues regarding structure and tempo. Robotic Musicians can use their physical bodies in a similar way when interacting with fellow musicians. The paper presents a new theoretical framework to classify musical gestures and a study evaluating the effect of robotic gestures on synchronization between human musicians and Shimon - a robotic marimba player developed at Georgia Tech. Shimon utilizes head and arm movements to signify musical information such as expected notes, tempo, and beat. The study, in which piano players were asked to play along with Shimon, assessed the effectiveness of these gestures on human-robot synchronization. Subjects were evaluated for their ability to synchronize with unknown tempo changes as communicated by Shimon's ancillary and social gestures. The results demonstrate the significant contribution of non-instrumental gestures to human-robot synchronization, highlighting the importance of non-music-making gestures for anticipation and coordination in human-robot musical collaboration. Subjects also indicated more positive feelings when interacting with the robot's ancillary and social gestures, indicating the role of these gestures in supporting engaging and enjoyable musical experiences.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":"11 ","pages":"1461615"},"PeriodicalIF":2.9000,"publicationDate":"2024-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11655300/pdf/","citationCount":"0","resultStr":"{\"title\":\"Music, body, and machine: gesture-based synchronization in human-robot musical interaction.\",\"authors\":\"Xuedan Gao, Amit Rogel, Raghavasimhan Sankaranarayanan, Brody Dowling, Gil Weinberg\",\"doi\":\"10.3389/frobt.2024.1461615\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Musical performance relies on nonverbal cues for conveying information among musicians. Human musicians use bodily gestures to communicate their interpretation and intentions to their collaborators, from mood and expression to anticipatory cues regarding structure and tempo. Robotic Musicians can use their physical bodies in a similar way when interacting with fellow musicians. The paper presents a new theoretical framework to classify musical gestures and a study evaluating the effect of robotic gestures on synchronization between human musicians and Shimon - a robotic marimba player developed at Georgia Tech. Shimon utilizes head and arm movements to signify musical information such as expected notes, tempo, and beat. The study, in which piano players were asked to play along with Shimon, assessed the effectiveness of these gestures on human-robot synchronization. Subjects were evaluated for their ability to synchronize with unknown tempo changes as communicated by Shimon's ancillary and social gestures. 
The results demonstrate the significant contribution of non-instrumental gestures to human-robot synchronization, highlighting the importance of non-music-making gestures for anticipation and coordination in human-robot musical collaboration. Subjects also indicated more positive feelings when interacting with the robot's ancillary and social gestures, indicating the role of these gestures in supporting engaging and enjoyable musical experiences.</p>\",\"PeriodicalId\":47597,\"journal\":{\"name\":\"Frontiers in Robotics and AI\",\"volume\":\"11 \",\"pages\":\"1461615\"},\"PeriodicalIF\":2.9000,\"publicationDate\":\"2024-12-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11655300/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Frontiers in Robotics and AI\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3389/frobt.2024.1461615\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/1/1 0:00:00\",\"PubModel\":\"eCollection\",\"JCR\":\"Q2\",\"JCRName\":\"ROBOTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Robotics and AI","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/frobt.2024.1461615","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"ROBOTICS","Score":null,"Total":0}
Music, body, and machine: gesture-based synchronization in human-robot musical interaction.
Musical performance relies on nonverbal cues to convey information among musicians. Human musicians use bodily gestures to communicate their interpretation and intentions to their collaborators, from mood and expression to anticipatory cues regarding structure and tempo. Robotic musicians can use their physical bodies in a similar way when interacting with fellow musicians. This paper presents a new theoretical framework for classifying musical gestures and a study evaluating the effect of robotic gestures on synchronization between human musicians and Shimon, a robotic marimba player developed at Georgia Tech. Shimon uses head and arm movements to convey musical information such as expected notes, tempo, and beat. The study, in which piano players were asked to play along with Shimon, assessed the effectiveness of these gestures for human-robot synchronization. Subjects were evaluated on their ability to synchronize with unknown tempo changes communicated by Shimon's ancillary and social gestures. The results demonstrate the significant contribution of non-instrumental gestures to human-robot synchronization, highlighting the importance of non-music-making gestures for anticipation and coordination in human-robot musical collaboration. Subjects also reported more positive feelings when interacting with the robot's ancillary and social gestures, suggesting that these gestures support engaging and enjoyable musical experiences.
Journal description:
Frontiers in Robotics and AI publishes rigorously peer-reviewed research covering all theory and applications of robotics, technology, and artificial intelligence, from biomedical to space robotics.