Visualization of the Relationship between Metadata and Acoustic Feature Values of Song Collections*
Midori Watanabe, Narumi Kuroko, Hayato Ohya, T. Itoh
2022 Nicograph International (NicoInt), June 2022. DOI: 10.1109/NicoInt55861.2022.00023
Research and services for automatic music classification and recommendation have been active in recent years. However, it is often unclear which metadata and acoustic features contribute strongly to the feasibility of music classification and recommendation. Motivated by this, we are working on the visualization of music pieces using metadata, acoustic features, machine learning methods, and visualization methods that are effective for music classification tasks, exploring whether new relationships between acoustic features and metadata can be discovered through visualization. Specifically, we calculated the acoustic features of a set of songs using music analysis tools and machine learning techniques, and visualized the distributions of the acoustic features and metadata. In this paper, we present experimental results visualizing the relationship between acoustic features and metadata, including release year, composer name, and artist name.
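The abstract describes a pipeline of computing acoustic features for a song collection and relating them to metadata such as release year. The paper does not name its analysis tools, so the following is only a minimal sketch of that idea: it computes one simple acoustic feature (the spectral centroid) with NumPy over a toy collection of synthetic tones, each tagged with a hypothetical release year, and summarizes the feature per metadata value the way a distribution plot would.

```python
# Hypothetical sketch (not the authors' implementation): compute a spectral
# centroid per "song" and aggregate it by a metadata field (release year).
import numpy as np

def spectral_centroid(signal, sr):
    """Magnitude-weighted mean frequency of the spectrum, in Hz."""
    mag = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sr)
    return float(np.sum(freqs * mag) / np.sum(mag))

sr = 22050
t = np.arange(sr) / sr  # one second of audio

# Toy "song collection": pure tones standing in for real recordings,
# each paired with an assumed release-year metadata value.
songs = [
    (1990, np.sin(2 * np.pi * 220.0 * t)),
    (1990, np.sin(2 * np.pi * 330.0 * t)),
    (2020, np.sin(2 * np.pi * 880.0 * t)),
    (2020, np.sin(2 * np.pi * 990.0 * t)),
]

# Group the feature values by metadata, as a scatter or summary plot would.
by_year = {}
for year, sig in songs:
    by_year.setdefault(year, []).append(spectral_centroid(sig, sr))

for year in sorted(by_year):
    print(year, round(float(np.mean(by_year[year])), 1))
```

In a real study, the per-song feature vectors would come from an audio analysis library and be drawn as a scatter plot colored by the metadata field, rather than printed as per-group means.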