M. Irrgang, J. Steffens, Hauke Egermann
Proceedings of the 5th International Conference on Movement and Computing, 2018-06-28
DOI: 10.1145/3212721.3212852
Smartphone-Assessed Movement Predicts Music Properties: Towards Integrating Embodied Music Cognition into Music Recommender Services via Accelerometer Motion Data
Numerous studies have shown a close relationship between movement and music [7], [17], [11], [14], [16], [3], [8]. That is why Leman calls for new mediation technologies to query music in a corporeal way [9]. The goal of the present study was therefore to explore how movement captured by smartphone accelerometer data relates to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 female, 10 male) moved a smartphone in response to 15 musical stimuli, each 20 s long, presented in random order. Motion features related to tempo, smoothness, size, regularity, and direction were extracted from the accelerometer data to predict the musical qualities "rhythmicity", "pitch level + range", and "complexity", as assessed by three music experts. Motion features selected by a 20-fold lasso predicted the musical properties to the following degrees: "rhythmicity" (R² = .47), "pitch level + range" (R² = .03), and "complexity" (R² = .10). We therefore conclude that musical properties can be predicted from the movement they evoke, and that an embodied approach to Music Information Retrieval is feasible.
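The modeling step described above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes scikit-learn's `LassoCV` as a stand-in for the "20-fold lasso", and the feature names, coefficients, and data are synthetic placeholders rather than the study's accelerometer recordings or expert ratings.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Hypothetical motion features per trial (tempo, smoothness, size,
# regularity, direction); 23 participants x 15 stimuli = 345 trials.
n_trials, n_features = 345, 5
X = rng.normal(size=(n_trials, n_features))

# Hypothetical expert rating of one musical property ("rhythmicity"),
# simulated here as a noisy linear function of the motion features.
true_coef = np.array([0.8, 0.3, 0.0, 0.0, 0.2])
y = X @ true_coef + rng.normal(scale=0.5, size=n_trials)

# A lasso tuned by 20-fold cross-validation both selects features
# (zeroed coefficients drop out) and fits the predictive model.
model = LassoCV(cv=20, random_state=0).fit(X, y)
r2 = r2_score(y, model.predict(X))

print("selected feature indices:", np.flatnonzero(model.coef_))
print(f"R^2: {r2:.2f}")
```

In this framing, each musical property ("rhythmicity", "pitch level + range", "complexity") would get its own lasso fit, and the reported R² values summarize how much variance in the expert ratings the selected motion features explain.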