Emotion Recognition Using Continuous Wavelet Transform and Ensemble of Convolutional Neural Networks through Transfer Learning from Electroencephalogram Signal
S. Bagherzadeh, K. Maghooli, Ahmad Shalbaf, Arash Maghsoudi
{"title":"Emotion Recognition Using Continuous Wavelet Transform and Ensemble of Convolutional Neural Networks through Transfer Learning from Electroencephalogram Signal","authors":"S. Bagherzadeh, K. Maghooli, Ahmad Shalbaf, Arash Maghsoudi","doi":"10.18502/fbt.v10i1.11512","DOIUrl":null,"url":null,"abstract":"Purpose: Emotions are integral brain states that can influence our behavior, decision-making, and functions. Electroencephalogram (EEG) is an appropriate modality for emotion recognition since it has high temporal resolution and is a non-invasive and cheap technique. \nMaterials and Methods: A novel approach based on Ensemble pre-trained Convolutional Neural Networks (ECNNs) is proposed to recognize four emotional classes from EEG channels of individuals watching music video clips. First, scalograms are built from one-dimensional EEG signals by applying the Continuous Wavelet Transform (CWT) method. Then, these images are used to re-train five CNNs: AlexNet, VGG-19, Inception-v1, ResNet-18, and Inception-v3. Then, the majority voting method is applied to make the final decision about emotional classes. The 10-fold cross-validation method is used to evaluate the performance of the proposed method on EEG signals of 32 subjects from the DEAP database. \nResults:.The experiments showed that applying the proposed ensemble approach in combinations of scalograms of frontal and parietal regions improved results. The best accuracy, sensitivity, precision, and F-score to recognize four emotional states achieved 96.90% ± 0.52, 97.30 ± 0.55, 96.97 ± 0.62, and 96.74 ± 0.56, respectively. \nConclusion: So, the newly proposed model from EEG signals improves recognition of the four emotional states in the DEAP database.","PeriodicalId":34203,"journal":{"name":"Frontiers in Biomedical Technologies","volume":"3 2 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2022-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Biomedical Technologies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.18502/fbt.v10i1.11512","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"Health Professions","Score":null,"Total":0}
引用次数: 0
Abstract
Purpose: Emotions are integral brain states that can influence our behavior, decision-making, and cognitive functions. The Electroencephalogram (EEG) is an appropriate modality for emotion recognition because it offers high temporal resolution and is a non-invasive, inexpensive technique.
Materials and Methods: A novel approach based on an Ensemble of pre-trained Convolutional Neural Networks (ECNN) is proposed to recognize four emotional classes from the EEG channels of individuals watching music video clips. First, scalograms are built from the one-dimensional EEG signals using the Continuous Wavelet Transform (CWT). These images are then used to re-train five pre-trained CNNs: AlexNet, VGG-19, Inception-v1, ResNet-18, and Inception-v3. Finally, majority voting over the five networks' outputs produces the final decision on the emotional class. Performance is evaluated with 10-fold cross-validation on the EEG signals of 32 subjects from the DEAP database.
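The pipeline described above can be illustrated with a minimal Python sketch (not the authors' code) using PyWavelets for the CWT scalograms and torchvision for a pre-trained CNN with a replaced classification head, followed by majority voting. The wavelet choice ('morl'), scale range, image size, and use of ResNet-18 with a recent torchvision weights API are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np
import pywt
import torch
import torch.nn as nn
from torchvision import models
from PIL import Image

def eeg_to_scalogram(signal, fs=128, scales=np.arange(1, 65), size=(224, 224)):
    """Convert a 1-D EEG segment into an RGB scalogram image via the CWT."""
    coeffs, _ = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)
    mag = np.abs(coeffs)
    mag = (mag - mag.min()) / (mag.max() - mag.min() + 1e-12)  # normalize to [0, 1]
    img = Image.fromarray((mag * 255).astype(np.uint8)).convert("RGB").resize(size)
    return np.asarray(img)

def make_resnet18_classifier(num_classes=4):
    """Load an ImageNet-pre-trained ResNet-18 and replace its head for transfer learning."""
    net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net

def majority_vote(logits_per_model):
    """Combine per-model logits (each [batch, classes]) by majority vote over class labels."""
    preds = torch.stack([l.argmax(dim=1) for l in logits_per_model])  # [n_models, batch]
    return torch.mode(preds, dim=0).values
```

In this sketch, each of the five networks would be fine-tuned separately on the scalogram images, and majority_vote would be applied to their per-sample predictions at test time, mirroring the ensemble decision step in the abstract.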
Results: The experiments showed that applying the proposed ensemble approach to combined scalograms of the frontal and parietal regions improved the results. The best accuracy, sensitivity, precision, and F-score for recognizing the four emotional states were 96.90 ± 0.52%, 97.30 ± 0.55%, 96.97 ± 0.62%, and 96.74 ± 0.56%, respectively.
Conclusion: The newly proposed model improves recognition of the four emotional states from EEG signals in the DEAP database.