A Whole Brain EEG Analysis of Musicianship

IF 1.3 | Psychology (CAS Q2) | Music Perception | Pub Date: 2019-09-01 | DOI: 10.1525/mp.2019.37.1.42
Estela Ribeiro, C. Thomaz
Citations: 3

Abstract

The neural activation patterns provoked in response to music listening can reveal whether or not a subject received music training. In the current exploratory study, we approached this two-group (musicians and nonmusicians) classification problem through a computational framework composed of the following steps: acoustic feature extraction; acoustic feature selection; trigger selection; EEG signal processing; and multivariate statistical analysis. We are particularly interested in analyzing the brain data at a global level, considering the activity registered in electroencephalogram (EEG) signals at a given time instant. Our experiment's results, with 26 volunteers (13 musicians and 13 nonmusicians) who listened to Hungarian Dance No. 5 by Johannes Brahms, have shown that it is possible to linearly differentiate musicians from nonmusicians with classification accuracies ranging from 69.2% (test set) to 93.8% (training set), despite the limited sample sizes available. Additionally, given the whole-brain vector navigation method described and implemented here, our results suggest that it is possible to highlight the most expressive and discriminant changes in the participants' brain activity patterns depending on the acoustic feature extracted from the audio.
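The two-group linear classification described in the abstract can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' implementation: it assumes each trial is already reduced to one whole-brain feature vector per subject (e.g., one value per electrode at a selected trigger instant), and uses PCA followed by Fisher's linear discriminant, a common combination for small-sample multivariate EEG analysis; the abstract does not specify the exact classifier used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for whole-brain EEG feature vectors:
# 13 "musicians" and 13 "nonmusicians", 64 features each
# (e.g., one value per electrode at a selected time instant).
n_per_group, n_features = 13, 64
musicians = rng.normal(0.5, 1.0, (n_per_group, n_features))
nonmusicians = rng.normal(-0.5, 1.0, (n_per_group, n_features))
X = np.vstack([musicians, nonmusicians])
y = np.array([1] * n_per_group + [0] * n_per_group)

# PCA via SVD: project onto the leading components to cope with
# the small-sample-size problem (n_samples << n_features).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
n_components = 10
Z = Xc @ Vt[:n_components].T

# Fisher's linear discriminant: w = Sw^{-1} (mu1 - mu0).
mu0, mu1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
Sw = np.cov(Z[y == 0], rowvar=False) + np.cov(Z[y == 1], rowvar=False)
w = np.linalg.solve(Sw, mu1 - mu0)

# Classify by projecting onto w and thresholding at the midpoint
# between the projected class means.
proj = Z @ w
threshold = 0.5 * (mu0 @ w + mu1 @ w)
pred = (proj > threshold).astype(int)
train_acc = (pred == y).mean()
print(f"training accuracy: {train_acc:.3f}")
```

With only 26 samples, a discriminant fit in a 10-dimensional PCA subspace will typically separate the training set much better than it generalizes, which is consistent with the gap the paper reports between training (93.8%) and test (69.2%) accuracy.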
Source journal: Music Perception
CiteScore: 3.70
Self-citation rate: 4.30%
Articles per year: 22
Journal description: Music Perception charts the ongoing scholarly discussion and study of musical phenomena. Publishing original empirical and theoretical papers, methodological articles, and critical reviews from renowned scientists and musicians, Music Perception is a repository of insightful research. The broad range of disciplines covered in the journal includes:
•Psychology
•Psychophysics
•Linguistics
•Neurology
•Neurophysiology
•Artificial intelligence
•Computer technology
•Physical and architectural acoustics
•Music theory