Embodied Meter Revisited
Authors: P. Toiviainen, Emily Carlson
Journal: Music Perception, Vol. 39, No. 3 (2022-02-01)
DOI: 10.1525/mp.2022.39.3.249
Previous research has shown that humans tend to embody musical meter at multiple beat levels during spontaneous dance. This work has been based on identifying typical periodic movement patterns, or eigenmovements, and has relied on time-domain analyses. The current study: 1) presents a novel method of using time-frequency analysis in conjunction with group-level tensor decomposition; 2) compares its results to those of time-domain analysis; and 3) investigates how the amplitude of eigenmovements depends on musical content and genre. Data comprised three-dimensional motion capture of 72 participants' spontaneous dance movements to 16 stimuli spanning eight different genres. Each trial was subjected to a discrete wavelet transform, concatenated into a trial-space-frequency tensor, and decomposed using tensor decomposition. Twelve movement primitives, or eigenmovements, were identified, eleven of which were frequency-locked to one of four metrical levels. The results suggest that time-frequency decomposition can more efficiently group movement directions together. Furthermore, the employed group-level decomposition allows for a straightforward analysis of interstimulus and interparticipant differences in music-induced movement. The amplitude of eigenmovements was found to depend on the amount of fluctuation in the music, particularly at the one- and two-beat levels.
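The first step of the pipeline described above — decomposing each trial's movement signal into frequency bands with a discrete wavelet transform before assembling the trial-space-frequency tensor — can be illustrated with a minimal sketch. This is not the authors' implementation (the paper does not specify the wavelet family; the Haar wavelet and the toy signal below are assumptions for illustration); it only shows how a multilevel DWT separates a movement signal into bands whose coarser levels correspond to slower, longer-period components, such as two- and four-beat metrical levels.

```python
import math

def haar_dwt(signal):
    """One level of the discrete Haar wavelet transform.

    Splits a signal into approximation (low-frequency) and detail
    (high-frequency) coefficients, each half the input length.
    """
    assert len(signal) % 2 == 0, "signal length must be even"
    s = 1 / math.sqrt(2)
    approx = [(a + b) * s for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) * s for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def multilevel_dwt(signal, levels):
    """Repeatedly transform the approximation, yielding one detail
    band per level plus a final coarse approximation. Coarser bands
    capture slower periodicities (e.g., longer metrical levels)."""
    bands = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        bands.append(detail)
    bands.append(approx)
    return bands

# Toy "movement" signal mixing a fast and a slow periodicity,
# standing in for one motion-capture channel of one trial.
signal = [math.sin(2 * math.pi * i / 4) + 0.5 * math.sin(2 * math.pi * i / 16)
          for i in range(32)]
bands = multilevel_dwt(signal, 3)
print([len(b) for b in bands])  # [16, 8, 4, 4]
```

In the study, band amplitudes computed this way for every marker direction and trial would be stacked into the trial-space-frequency tensor that is then factorized; the sketch stops at the per-channel transform.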
Journal description:
Music Perception charts the ongoing scholarly discussion and study of musical phenomena. Publishing original empirical and theoretical papers, methodological articles, and critical reviews from renowned scientists and musicians, Music Perception is a repository of insightful research. The broad range of disciplines covered in the journal includes:
• Psychology
• Psychophysics
• Linguistics
• Neurology
• Neurophysiology
• Artificial intelligence
• Computer technology
• Physical and architectural acoustics
• Music theory