{"title":"Emotion Recognition of Playing Musicians From EEG, ECG, and Acoustic Signals","authors":"Luca Turchet;Barry O'Sullivan;Rupert Ortner;Christoph Guger","doi":"10.1109/THMS.2024.3430327","DOIUrl":null,"url":null,"abstract":"This article investigated the automatic recognition of felt and musically communicated emotions using electroencephalogram (EEG), electrocardiogram (ECG), and acoustic signals, which were recorded from eleven musicians instructed to perform music in order to communicate happiness, sadness, relaxation, and anger. Musicians' self-reports indicated that the emotions they musically expressed were highly consistent with those they actually felt. Results showed that the best classification performances, in a subject-dependent classification using a KNN classifier were achieved by using features derived from both the EEG and ECG (with an accuracy of 98.11%). Which was significantly more accurate than using ECG features alone, but was not significantly more accurate than using EEG features alone. The use of acoustic features alone or in combination with EEG and/or ECG features did not lead to better performances than those achieved with EEG plus ECG or EEG alone. Our results suggest that emotion detection of playing musicians, both felt and musically communicated, when coherent, can be classified in a more reliable way using physiological features than involving acoustic features. The reported machine learning results are a step toward the development of affective brain–computer interfaces capable of automatically inferring the emotions of a playing musician in real-time.","PeriodicalId":48916,"journal":{"name":"IEEE Transactions on Human-Machine Systems","volume":null,"pages":null},"PeriodicalIF":3.5000,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10620218","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Human-Machine Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10620218/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
This article investigated the automatic recognition of felt and musically communicated emotions using electroencephalogram (EEG), electrocardiogram (ECG), and acoustic signals, which were recorded from eleven musicians instructed to perform music in order to communicate happiness, sadness, relaxation, and anger. The musicians' self-reports indicated that the emotions they musically expressed were highly consistent with those they actually felt. Results showed that the best classification performance, in a subject-dependent classification using a KNN classifier, was achieved by using features derived from both the EEG and ECG (with an accuracy of 98.11%), which was significantly more accurate than using ECG features alone but not significantly more accurate than using EEG features alone. The use of acoustic features, alone or in combination with EEG and/or ECG features, did not lead to better performance than that achieved with EEG plus ECG or EEG alone. Our results suggest that the emotions of playing musicians, both felt and musically communicated, when coherent, can be classified more reliably from physiological features than from acoustic features. The reported machine learning results are a step toward the development of affective brain–computer interfaces capable of automatically inferring the emotions of a playing musician in real time.
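To make the classification setup concrete, the sketch below shows a subject-dependent KNN classification of the four emotion classes from fused EEG and ECG feature vectors, in the spirit of the abstract. The feature dimensions, the choice of k, the scaling step, and the placeholder data are all assumptions for illustration; the paper's actual feature extraction and evaluation protocol are not reproduced here.

```python
# Hypothetical sketch: subject-dependent KNN classification of emotions from
# fused EEG + ECG features. All data below are synthetic placeholders.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# One musician's trials (subject-dependent setting): assumed feature counts.
n_trials = 200
eeg_features = rng.normal(size=(n_trials, 64))   # e.g., band-power features (assumed)
ecg_features = rng.normal(size=(n_trials, 8))    # e.g., heart-rate-variability features (assumed)
X = np.hstack([eeg_features, ecg_features])      # fused EEG + ECG feature vector per trial
y = rng.integers(0, 4, size=n_trials)            # 0=happiness, 1=sadness, 2=relaxation, 3=anger

# Illustrative KNN pipeline; k=5 and standardization are not the paper's choices.
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```

With real per-trial EEG and ECG features in place of the synthetic arrays, the same pipeline would yield the kind of per-subject accuracy figures reported in the abstract.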
Journal overview:
The scope of the IEEE Transactions on Human-Machine Systems covers the field of human-machine systems. It includes human systems and human-organizational interactions, including cognitive ergonomics, system test and evaluation, and human information processing concerns in systems and organizations.