Juan-Daniel Galeano-Otálvaro, Jordi Martorell, Lars Meyer, Lorenzo Titone
{"title":"跨脑电频段的音乐旋律预期神经编码。","authors":"Juan-Daniel Galeano-Otálvaro, Jordi Martorell, Lars Meyer, Lorenzo Titone","doi":"10.1111/ejn.16581","DOIUrl":null,"url":null,"abstract":"<p><p>The human brain tracks regularities in the environment and extrapolates these to predict future events. Prior work on music cognition suggests that low-frequency (1-8 Hz) brain activity encodes melodic predictions beyond the stimulus acoustics. Building on this work, we aimed to disentangle the frequency-specific neural dynamics linked to melodic prediction uncertainty (modelled as entropy) and prediction error (modelled as surprisal) for temporal (note onset) and content (note pitch) information. By using multivariate temporal response function (TRF) models, we re-analysed the electroencephalogram (EEG) from 20 subjects (10 musicians) who listened to Western tonal music. Our results show that melodic expectation metrics improve the EEG reconstruction accuracy in all frequency bands below the gamma range (< 30 Hz). Crucially, we found that entropy contributed more strongly to the reconstruction accuracy enhancement compared to surprisal in all frequency bands. Additionally, we found that the encoding of temporal, but not content, information metrics was not limited to low frequencies, rather it extended to higher frequencies (> 8 Hz). An analysis of the TRF weights revealed that the temporal predictability of a note (entropy of note onset) may be encoded in the delta- (1-4 Hz) and beta-band (12-30 Hz) brain activity prior to the stimulus, suggesting that these frequency bands associate with temporal predictions. Strikingly, we also revealed that melodic expectations selectively enhanced EEG reconstruction accuracy in the beta band for musicians, and in the alpha band (8-12 Hz) for non-musicians, suggesting that musical expertise influences the neural dynamics underlying predictive processing in music cognition.</p>","PeriodicalId":11993,"journal":{"name":"European Journal of Neuroscience","volume":" ","pages":""},"PeriodicalIF":2.7000,"publicationDate":"2024-10-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Neural encoding of melodic expectations in music across EEG frequency bands.\",\"authors\":\"Juan-Daniel Galeano-Otálvaro, Jordi Martorell, Lars Meyer, Lorenzo Titone\",\"doi\":\"10.1111/ejn.16581\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>The human brain tracks regularities in the environment and extrapolates these to predict future events. Prior work on music cognition suggests that low-frequency (1-8 Hz) brain activity encodes melodic predictions beyond the stimulus acoustics. Building on this work, we aimed to disentangle the frequency-specific neural dynamics linked to melodic prediction uncertainty (modelled as entropy) and prediction error (modelled as surprisal) for temporal (note onset) and content (note pitch) information. By using multivariate temporal response function (TRF) models, we re-analysed the electroencephalogram (EEG) from 20 subjects (10 musicians) who listened to Western tonal music. Our results show that melodic expectation metrics improve the EEG reconstruction accuracy in all frequency bands below the gamma range (< 30 Hz). Crucially, we found that entropy contributed more strongly to the reconstruction accuracy enhancement compared to surprisal in all frequency bands. 
Additionally, we found that the encoding of temporal, but not content, information metrics was not limited to low frequencies, rather it extended to higher frequencies (> 8 Hz). An analysis of the TRF weights revealed that the temporal predictability of a note (entropy of note onset) may be encoded in the delta- (1-4 Hz) and beta-band (12-30 Hz) brain activity prior to the stimulus, suggesting that these frequency bands associate with temporal predictions. Strikingly, we also revealed that melodic expectations selectively enhanced EEG reconstruction accuracy in the beta band for musicians, and in the alpha band (8-12 Hz) for non-musicians, suggesting that musical expertise influences the neural dynamics underlying predictive processing in music cognition.</p>\",\"PeriodicalId\":11993,\"journal\":{\"name\":\"European Journal of Neuroscience\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2024-10-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"European Journal of Neuroscience\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1111/ejn.16581\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"NEUROSCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"European Journal of Neuroscience","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1111/ejn.16581","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"NEUROSCIENCES","Score":null,"Total":0}
Neural encoding of melodic expectations in music across EEG frequency bands.
The human brain tracks regularities in the environment and extrapolates these to predict future events. Prior work on music cognition suggests that low-frequency (1-8 Hz) brain activity encodes melodic predictions beyond the stimulus acoustics. Building on this work, we aimed to disentangle the frequency-specific neural dynamics linked to melodic prediction uncertainty (modelled as entropy) and prediction error (modelled as surprisal) for temporal (note onset) and content (note pitch) information. Using multivariate temporal response function (TRF) models, we re-analysed the electroencephalogram (EEG) from 20 subjects (10 musicians) who listened to Western tonal music. Our results show that melodic expectation metrics improve EEG reconstruction accuracy in all frequency bands below the gamma range (< 30 Hz). Crucially, we found that entropy contributed more strongly than surprisal to this enhancement in reconstruction accuracy across all frequency bands. Additionally, we found that the encoding of temporal, but not content, information metrics was not limited to low frequencies but extended to higher frequencies (> 8 Hz). An analysis of the TRF weights revealed that the temporal predictability of a note (entropy of note onset) may be encoded in delta- (1-4 Hz) and beta-band (12-30 Hz) brain activity prior to the stimulus, suggesting that these frequency bands are associated with temporal predictions. Strikingly, we also found that melodic expectations selectively enhanced EEG reconstruction accuracy in the beta band for musicians and in the alpha band (8-12 Hz) for non-musicians, suggesting that musical expertise influences the neural dynamics underlying predictive processing in music cognition.
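For readers unfamiliar with the analysis pipeline, the sketch below illustrates the general idea of a forward multivariate TRF analysis of the kind described in the abstract: stimulus features (e.g., the acoustic envelope plus per-note surprisal, -log p of the heard note, and entropy, -Σ p log p over the predictive distribution) are time-lagged and ridge-regressed onto band-pass-filtered EEG, and model quality is scored as the Pearson correlation between predicted and held-out EEG. This is a minimal, hypothetical NumPy sketch, not the authors' code; the feature set, lag window and ridge parameter are illustrative assumptions.

```python
# Minimal forward-TRF sketch (not the authors' pipeline): time-lagged stimulus
# features are ridge-regressed onto EEG, and prediction accuracy is the
# Pearson correlation between predicted and held-out EEG per channel.
import numpy as np

def lagged_design(features, lags):
    """Stack time-shifted copies of each feature column (zero-padded at the edges)."""
    n_times, n_feat = features.shape
    X = np.zeros((n_times, n_feat * len(lags)))
    for j, lag in enumerate(lags):
        shifted = np.roll(features, lag, axis=0)
        if lag > 0:
            shifted[:lag] = 0.0
        elif lag < 0:
            shifted[lag:] = 0.0
        X[:, j * n_feat:(j + 1) * n_feat] = shifted
    return X

def fit_trf(X, eeg, alpha=1.0):
    """Closed-form ridge regression: one TRF weight vector per EEG channel."""
    XtX = X.T @ X + alpha * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ eeg)

def prediction_accuracy(X, eeg, weights):
    """Pearson r between predicted and recorded EEG, per channel."""
    pred = X @ weights
    return np.array([np.corrcoef(pred[:, c], eeg[:, c])[0, 1]
                     for c in range(eeg.shape[1])])

# Toy run on synthetic data (hypothetical sizes: 64 Hz, 60 s, 4 features, 20 channels).
rng = np.random.default_rng(0)
sfreq, n_times, n_chan = 64, 64 * 60, 20
feats = rng.standard_normal((n_times, 4))      # e.g. envelope, onset surprisal/entropy, pitch surprisal
eeg = rng.standard_normal((n_times, n_chan))   # stands in for band-pass-filtered EEG
lags = list(range(int(-0.1 * sfreq), int(0.4 * sfreq)))  # roughly -100 ms to +375 ms
X = lagged_design(feats, lags)
half = n_times // 2                            # first half = train, second half = test
W = fit_trf(X[:half], eeg[:half], alpha=10.0)
print(prediction_accuracy(X[half:], eeg[half:], W).mean())
```

In a study like this one, such models would presumably be fitted separately on EEG filtered into each band (delta, theta, alpha, beta), and the gain in prediction accuracy from adding the expectation features over acoustics alone would quantify their neural encoding.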
About the journal:
EJN is the journal of FENS and supports the international neuroscientific community by publishing original, high-quality research articles and reviews in all fields of neuroscience. In addition, to engage with issues of interest to the scientific community, we also publish Editorials, Meeting Reports and Neuro-Opinions on topics of current interest in neuroscience research and training. We have recently established a series of ‘Profiles of Women in Neuroscience’. Our goal is to provide a vehicle for publications that further the understanding of the structure and function of the nervous system in both health and disease, and to engage the neuroscience community. As the official journal of FENS, profits from the journal are re-invested in the neuroscientific community through the activities of FENS.