{"title":"A chord distance metric based on the Tonal Pitch Space and a key-finding method for chord annotation sequences","authors":"Lucas Marques","doi":"10.5753/sbcm.2019.10435","DOIUrl":null,"url":null,"abstract":"Music Information Retrieval (MIR) is a growing field of research concerned about recovering and generating useful information about music in general. One classic problem of MIR is key-finding, which could be described as the activity of finding the most stable tone and mode of a determined musical piece or a fragment of it. This problem, however, is usually modeled for audio as an input, sometimes MIDI, but little attention seems to be given to approaches considering musical notations and musictheory. This paper will present a method of key-finding that has chord annotations as its only input. A new metric is proposed for calculating distances between tonal pitch spaces and chords, which will be later used to create a key-finding method for chord annotations sequences. We achieve a success rate from 77.85% up to 88.75% for the whole database, depending on whether or not and how some parameters of approximation are configured. We argue that musical-theoretical approaches independent of audio could still bring progress to the MIR area and definitely could be used as complementary techniques.","PeriodicalId":338771,"journal":{"name":"Anais do Simpósio Brasileiro de Computação Musical (SBCM 2019)","volume":"108 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-09-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Anais do Simpósio Brasileiro de Computação Musical (SBCM 2019)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5753/sbcm.2019.10435","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Music Information Retrieval (MIR) is a growing field of research concerned with recovering and generating useful information about music in general. One classic problem in MIR is key-finding, which can be described as finding the most stable tone and mode of a given musical piece or a fragment of it. This problem, however, is usually modeled with audio as input, sometimes MIDI, and little attention seems to be given to approaches based on musical notation and music theory. This paper presents a key-finding method that takes chord annotations as its only input. A new metric is proposed for calculating distances between tonal pitch spaces and chords, which is then used to build a key-finding method for chord annotation sequences. We achieve a success rate from 77.85% up to 88.75% over the whole database, depending on whether and how certain approximation parameters are configured. We argue that music-theoretical approaches independent of audio can still bring progress to the MIR area and can certainly be used as complementary techniques.
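To make the general shape of such a method concrete, the sketch below scores each of the 24 major/minor key candidates by summing a chord-to-key distance over a chord annotation sequence and returns the key with the smallest total. This is only an illustrative sketch under stated assumptions: the chord_key_distance here (counting chord tones outside the key's scale) is a placeholder, not the Tonal Pitch Space-based metric proposed in the paper, and the chord parser, scale tables, and function names are hypothetical rather than taken from the paper.

```python
# Illustrative key-finding over a chord annotation sequence.
# NOTE: chord_key_distance is a placeholder assumption; the paper's actual
# metric is derived from Lerdahl's Tonal Pitch Space and is richer than this.

PITCH_CLASSES = {"C": 0, "C#": 1, "Db": 1, "D": 2, "D#": 3, "Eb": 3, "E": 4,
                 "F": 5, "F#": 6, "Gb": 6, "G": 7, "G#": 8, "Ab": 8,
                 "A": 9, "A#": 10, "Bb": 10, "B": 11}

MAJOR_SCALE = [0, 2, 4, 5, 7, 9, 11]   # diatonic steps of a major key
MINOR_SCALE = [0, 2, 3, 5, 7, 8, 10]   # natural minor, kept simple here

def chord_tones(label):
    """Very rough chord parser: root (letter plus optional accidental) and
    an 'm' suffix for minor triads, e.g. 'C', 'F#m', 'Bb'."""
    if label.endswith("m"):
        root, third = label[:-1], 3
    else:
        root, third = label, 4
    pc = PITCH_CLASSES[root]
    return {pc % 12, (pc + third) % 12, (pc + 7) % 12}

def chord_key_distance(chord, key_root, key_mode):
    """Placeholder distance: number of chord tones outside the key's scale."""
    scale = MAJOR_SCALE if key_mode == "major" else MINOR_SCALE
    key_set = {(key_root + step) % 12 for step in scale}
    return len(chord_tones(chord) - key_set)

def find_key(chord_sequence):
    """Return the (root, mode) pair minimizing the summed distance.
    Ties (e.g. relative major/minor under this toy distance) fall to the
    first candidate; a real metric would disambiguate them."""
    candidates = [(r, m) for r in range(12) for m in ("major", "minor")]
    return min(candidates,
               key=lambda k: sum(chord_key_distance(c, k[0], k[1])
                                 for c in chord_sequence))

if __name__ == "__main__":
    root, mode = find_key(["C", "Am", "F", "G"])
    print(root, mode)  # 0 major, i.e. C major under this toy distance
```

The design point the sketch illustrates is that chord annotations alone, with a suitable chord-to-key distance, already define a complete key-finding procedure with no audio or MIDI in the loop; the paper's contribution lies in the distance metric itself.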