Dance to your own drum: Identification of musical genre and individual dancer from motion capture using machine learning
Emily Carlson, Pasi Saari, Birgitta Burger, P. Toiviainen
Journal of New Music Research, 49(1), 162–177. Published online 2020-01-13. DOI: 10.1080/09298215.2020.1711778
Abstract: Machine learning has been used to accurately classify musical genre using features derived from audio signals. Musical genre, as well as lower-level audio features of music, has also been shown to influence music-induced movement; however, the degree to which such movements are genre-specific has not been explored. The current paper addresses this question using motion capture data from participants dancing freely to music from eight genres. Using a Support Vector Machine model, the data were classified by genre and by individual dancer. Against expectations, individual classification was notably more accurate than genre classification. Results are discussed in terms of embodied cognition and culture.
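The abstract describes classifying the same movement features along two different label sets (genre vs. individual dancer) with a Support Vector Machine. A minimal sketch of that setup, using synthetic feature vectors and scikit-learn's `SVC` as a stand-in (the feature construction, dimensions, and signal strengths below are illustrative assumptions, not the study's actual motion-capture pipeline):

```python
# Sketch only: synthetic stand-in for movement features, not the paper's data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)

n_dancers, n_genres, n_trials, n_features = 5, 8, 4, 10  # hypothetical sizes
# Each dancer has a stable personal movement signature (strong signal);
# each genre adds a shared but weaker offset, mirroring the paper's finding
# that individual identity dominates genre in the movement data.
dancer_sigs = rng.normal(0.0, 1.0, size=(n_dancers, n_features))
genre_sigs = rng.normal(0.0, 0.3, size=(n_genres, n_features))

X, genre_labels, dancer_labels = [], [], []
for d in range(n_dancers):
    for g in range(n_genres):
        for _ in range(n_trials):
            X.append(dancer_sigs[d] + genre_sigs[g]
                     + rng.normal(0.0, 0.2, n_features))  # trial noise
            genre_labels.append(g)
            dancer_labels.append(d)
X = np.array(X)

# Classify the same features twice: once by genre, once by dancer.
genre_acc = cross_val_score(SVC(kernel="linear"), X, genre_labels, cv=4).mean()
dancer_acc = cross_val_score(SVC(kernel="linear"), X, dancer_labels, cv=4).mean()
print(f"genre accuracy:  {genre_acc:.2f}")
print(f"dancer accuracy: {dancer_acc:.2f}")
```

With the assumed signal strengths, dancer classification comes out near ceiling, loosely mirroring the abstract's result that individual classification outperformed genre classification.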
Journal description:
The Journal of New Music Research (JNMR) publishes material which increases our understanding of music and musical processes by systematic, scientific and technological means. Research published in the journal is innovative, empirically grounded and often, but not exclusively, uses quantitative methods. Articles are both musically relevant and scientifically rigorous, giving full technical details. No bounds are placed on the music or musical behaviours at issue: popular music, music of diverse cultures and the canon of western classical music are all within the Journal’s scope. Articles deal with theory, analysis, composition, performance, uses of music, instruments and other music technologies. The Journal was founded in 1972 with the original title Interface to reflect its interdisciplinary nature, drawing on musicology (including music theory), computer science, psychology, acoustics, philosophy, and other disciplines.