Probing the neural dynamics of musicians’ and non-musicians’ consonant/dissonant perception: Joint analyses of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI)
{"title":"Probing the neural dynamics of musicians’ and non-musicians’ consonant/dissonant perception: Joint analyses of electrical encephalogram (EEG) and functional magnetic resonance imaging (fMRI)","authors":"","doi":"10.1016/j.neuroimage.2024.120784","DOIUrl":null,"url":null,"abstract":"<div><p>The perception of two (or more) simultaneous musical notes, depending on their pitch interval(s), could be broadly categorized as consonant or dissonant. Previous literature has suggested that musicians and non-musicians adopt different strategies when discerning music intervals: while musicians rely on the frequency ratios between the two fundamental frequencies, such as “perfect fifth” (3:2) as consonant and “tritone” (45:32) as dissonant intervals; non-musicians may rely on the presence of ‘roughness’ or ‘beats’, generated by the difference of fundamental frequencies, as the key elements of ‘dissonance’. The separate Event-Related Potential (ERP) differences in N1 and P2 along the midline electrodes provided evidence congruent with such ‘separate reliances’. To replicate and to extend, in this study we reran the previous experiment, and separately collected fMRI data of the same protocol (with sparse sampling modifications). The behavioral and EEG results largely corresponded to our previous finding. The fMRI results, with the joint analyses by univariate, psycho-physiological interaction, and representational similarity analysis (RSA) approaches, further reinforce the involvement of central midline-related brain regions, such as ventromedial prefrontal and dorsal anterior cingulate cortex, in consonant/dissonance judgments. The final spatiotemporal searchlight RSA provided convincing evidence that the medial prefrontal cortex, along with the bilateral superior temporal cortex, is the joint locus of midline N1 and dorsal anterior cingulate cortex for the P2 effect (for musicians). Together, these analyses reaffirm that musicians rely more on experience-driven knowledge for consonance/dissonance perception; but also demonstrate the advantages of multiple analyses in constraining the findings from both EEG and fMRI.</p></div>","PeriodicalId":19299,"journal":{"name":"NeuroImage","volume":null,"pages":null},"PeriodicalIF":4.7000,"publicationDate":"2024-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S1053811924002817/pdfft?md5=07cd252f94858c884e3cfc6b12136c24&pid=1-s2.0-S1053811924002817-main.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"NeuroImage","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1053811924002817","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"NEUROIMAGING","Score":null,"Total":0}
引用次数: 0
Abstract
The perception of two (or more) simultaneous musical notes can, depending on their pitch interval(s), be broadly categorized as consonant or dissonant. Previous literature has suggested that musicians and non-musicians adopt different strategies when discerning musical intervals: musicians rely on the ratio between the two fundamental frequencies, treating, for example, the “perfect fifth” (3:2) as consonant and the “tritone” (45:32) as dissonant, whereas non-musicians may rely on the presence of ‘roughness’ or ‘beats’, generated by the difference between the fundamental frequencies, as the key element of ‘dissonance’. Separate Event-Related Potential (ERP) differences in N1 and P2 along the midline electrodes provided evidence congruent with these ‘separate reliances’. To replicate and extend these findings, in this study we reran the previous experiment and separately collected fMRI data with the same protocol (modified for sparse sampling). The behavioral and EEG results largely corresponded to our previous findings. The fMRI results, jointly analyzed with univariate, psycho-physiological interaction, and representational similarity analysis (RSA) approaches, further reinforce the involvement of central midline-related brain regions, such as the ventromedial prefrontal and dorsal anterior cingulate cortex, in consonance/dissonance judgments. The final spatiotemporal searchlight RSA provided convincing evidence that the medial prefrontal cortex, along with the bilateral superior temporal cortex, is the joint locus of the midline N1 effect, and the dorsal anterior cingulate cortex that of the P2 effect (for musicians). Together, these analyses reaffirm that musicians rely more on experience-driven knowledge for consonance/dissonance perception, and also demonstrate the advantage of multiple analyses in constraining the findings from both EEG and fMRI.
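As an illustrative aside (not part of the published study), the two cues contrasted in the abstract can be made concrete with a minimal Python sketch: the frequency ratio of a dyad, the cue musicians are hypothesized to exploit, and the difference between the two fundamental frequencies, which gives rise to the ‘beats’/‘roughness’ cue attributed to non-musicians. The 440 Hz root note is an arbitrary assumption chosen for the example.

```python
# Illustrative sketch (not from the paper): contrast the two cues named in the
# abstract -- simple frequency ratios vs. the difference ("beat") frequency
# between the two fundamentals. The 440 Hz root (A4) is an assumption.
from fractions import Fraction

ROOT_HZ = 440.0  # assumed root note; any fundamental would do

# Intervals named in the abstract, expressed as frequency ratios.
INTERVALS = {
    "perfect fifth (consonant)": Fraction(3, 2),
    "tritone (dissonant)": Fraction(45, 32),
}

for name, ratio in INTERVALS.items():
    f1 = ROOT_HZ
    f2 = ROOT_HZ * float(ratio)          # upper note of the dyad
    beat_hz = abs(f2 - f1)               # difference of the two fundamentals
    print(f"{name}: ratio {ratio.numerator}:{ratio.denominator}, "
          f"fundamentals {f1:.1f} Hz and {f2:.1f} Hz, "
          f"difference frequency {beat_hz:.1f} Hz")
```

This sketch is deliberately simplified: perceptual roughness also depends on interactions among upper partials and on critical bandwidth, which it ignores. It only illustrates how the two dyads differ on the ratio and difference-frequency cues the abstract names.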
About the journal
NeuroImage, a Journal of Brain Function, provides a vehicle for communicating important advances in acquiring, analyzing, and modelling neuroimaging data and in applying these techniques to the study of structure–function and brain–behavior relationships. Though the emphasis is on the macroscopic level of human brain organization, meso- and microscopic neuroimaging across all species will be considered if informative for understanding the aforementioned relationships.