{"title":"音乐表演的形象化有助于听众的理解","authors":"Rumi Hiraga, N. Matsuda","doi":"10.1145/989863.989878","DOIUrl":null,"url":null,"abstract":"We present a new method for visualizing musical expressions with a special focus on the three major elements of tempo change, dynamics change, and articulation. We have represented tempo change as a horizontal interval delimited by vertical lines, while dynamics change and articulation within the interval are represented by the height and width of a bar, respectively. Then we grouped local expression into several groups by k-means clustering based on the values of the elements. The resulting groups represented the emotional expression in a performance that is controlled by the rhythmic and melodic structure, which controls the gray scale of the graphical components. We ran a pilot experiment to test the effectiveness of our method using two matching tasks and a questionnaire. In the first task, we used the same section of music, played by two different interpretations, while in the second task, two different sections of a performance were used. The results of the test seem to support the present approach, although there is still room for further improvement that will reflect the subtleties in performance.","PeriodicalId":215861,"journal":{"name":"Proceedings of the working conference on Advanced visual interfaces","volume":"729 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2004-05-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"24","resultStr":"{\"title\":\"Visualization of music performance as an aid to listener's comprehension\",\"authors\":\"Rumi Hiraga, N. Matsuda\",\"doi\":\"10.1145/989863.989878\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We present a new method for visualizing musical expressions with a special focus on the three major elements of tempo change, dynamics change, and articulation. We have represented tempo change as a horizontal interval delimited by vertical lines, while dynamics change and articulation within the interval are represented by the height and width of a bar, respectively. Then we grouped local expression into several groups by k-means clustering based on the values of the elements. The resulting groups represented the emotional expression in a performance that is controlled by the rhythmic and melodic structure, which controls the gray scale of the graphical components. We ran a pilot experiment to test the effectiveness of our method using two matching tasks and a questionnaire. In the first task, we used the same section of music, played by two different interpretations, while in the second task, two different sections of a performance were used. 
The results of the test seem to support the present approach, although there is still room for further improvement that will reflect the subtleties in performance.\",\"PeriodicalId\":215861,\"journal\":{\"name\":\"Proceedings of the working conference on Advanced visual interfaces\",\"volume\":\"729 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2004-05-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"24\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the working conference on Advanced visual interfaces\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/989863.989878\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the working conference on Advanced visual interfaces","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/989863.989878","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Visualization of music performance as an aid to listener's comprehension
We present a new method for visualizing musical expression, focusing on three major elements: tempo change, dynamics change, and articulation. Tempo change is represented as a horizontal interval delimited by vertical lines, while dynamics change and articulation within the interval are represented by the height and width of a bar, respectively. Local expressions are then grouped by k-means clustering on the values of these elements; the resulting groups reflect the emotional expression of a performance as shaped by its rhythmic and melodic structure, and each group determines the gray scale of the corresponding graphical components. We ran a pilot experiment to test the effectiveness of the method using two matching tasks and a questionnaire. In the first task, we used the same section of music played in two different interpretations; in the second task, two different sections of a performance were used. The results appear to support the present approach, although there is still room for improvement in reflecting the subtleties of performance.
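
As a rough illustration of the encoding described above, the sketch below maps hypothetical per-note features (onset time, dynamics, articulation) onto the graphical elements: inter-onset intervals delimited by vertical lines for tempo change, bar height for dynamics, bar width for articulation, and a k-means cluster label rendered as the gray level. This is not the authors' implementation; it uses matplotlib and scikit-learn, and all data values and parameter choices (e.g. three clusters) are assumptions made for demonstration.

# Illustrative sketch only (assumed data and parameters), not the paper's system.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.cluster import KMeans

# Hypothetical local-expression features for a short passage:
# onset times (s), dynamics (0-1 loudness), articulation (0-1 staccato..legato).
onsets = np.array([0.0, 0.52, 1.01, 1.55, 2.03, 2.60, 3.12, 3.70])
dynamics = np.array([0.4, 0.5, 0.7, 0.9, 0.8, 0.6, 0.5, 0.3])
articulation = np.array([0.9, 0.8, 0.5, 0.3, 0.4, 0.7, 0.8, 0.9])

# Tempo change: inter-onset intervals become the horizontal extent of each unit
# (the last interval is padded with an assumed 0.5 s duration).
intervals = np.diff(onsets, append=onsets[-1] + 0.5)

# Group local expressions by k-means on the three element values;
# each cluster is rendered at a different gray level.
features = np.column_stack([intervals, dynamics, articulation])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
grays = labels / (labels.max() + 1)  # map cluster id to a gray level in [0, 1)

fig, ax = plt.subplots(figsize=(8, 2.5))
for onset, iv, dyn, art, g in zip(onsets, intervals, dynamics, articulation, grays):
    ax.axvline(onset, color="black", linewidth=0.8)       # vertical interval delimiter
    bar_width = art * iv                                   # articulation -> bar width
    ax.bar(onset, dyn, width=bar_width, align="edge",      # dynamics -> bar height
           color=str(g), edgecolor="black")                # cluster -> gray scale
ax.set_xlabel("time (s)")
ax.set_ylabel("dynamics")
ax.set_ylim(0, 1)
plt.tight_layout()
plt.show()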