Classifying Cultural Music using Melodic Features
Amruta Vidwans, Prateek Verma, P. Rao
2020 International Conference on Signal Processing and Communications (SPCOM), July 2020
DOI: 10.1109/SPCOM50965.2020.9179597
Cited by: 5
Abstract
We present melody-based classification of musical styles that exploits pitch- and energy-based characteristics computed from the audio signal. Three prominent musical styles were chosen in which improvisation is integral and which share similar melodic principles, themes, and concert structure: Hindustani, Carnatic, and Turkish music. Listeners familiar with one or more of these genres can discriminate among them based entirely on melodic style. To validate our hypothesis that the style distinction is embedded in the melody, we used resynthesized melodies of pieces that share the underlying raga/makam, thereby removing any singer-specific cues. Our automatic method rests on a set of highly discriminative features, motivated by musicological knowledge, that capture distinct characteristics of the melodic contour: the nature of transitions in the pitch contour, the presence of microtonal notes, and the dynamic variations in vocal energy. The automatically assigned style labels correlate well with the judgments of human listeners. Furthermore, combining the melody-based features with timbre-based features improves classification performance against the metadata-based genre labels.
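To make the feature-extraction idea concrete, here is a minimal sketch of two contour descriptors in the spirit of those described above, applied to a pitch contour expressed in cents: a transition rate (how often the contour moves rapidly rather than holding a steady note) and a microtonal fraction (how many samples lie far from the 100-cent semitone grid), followed by a nearest-centroid classifier. The thresholds, the feature set, and the classifier are illustrative assumptions for this sketch, not the paper's actual implementation.

```python
import numpy as np

def melodic_features(pitch_cents):
    """Toy melodic descriptors for a voiced pitch contour (values in cents).

    Returns [transition_rate, microtonal_frac]; both thresholds below are
    hypothetical choices, not taken from the paper.
    """
    p = np.asarray(pitch_cents, dtype=float)
    dp = np.abs(np.diff(p))
    # Fraction of frames where the contour moves faster than 20 cents/frame.
    transition_rate = float(np.mean(dp > 20.0))
    # Distance of each sample to the nearest 100-cent (semitone) grid point.
    r = p % 100.0
    dist = np.minimum(r, 100.0 - r)
    # Fraction of samples more than 30 cents away from the semitone grid.
    microtonal_frac = float(np.mean(dist > 30.0))
    return np.array([transition_rate, microtonal_frac])

def train_centroids(contours, labels):
    """Average the feature vectors of the training contours per style label."""
    feats = {}
    for contour, label in zip(contours, labels):
        feats.setdefault(label, []).append(melodic_features(contour))
    return {label: np.mean(f, axis=0) for label, f in feats.items()}

def classify(contour, centroids):
    """Assign the style whose centroid is nearest in feature space."""
    f = melodic_features(contour)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))
```

A steady, on-grid contour (flat held notes) yields a low transition rate and near-zero microtonal fraction, while a continuously gliding, ornamented contour scores high on both, so even this two-dimensional space separates the two synthetic styles.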