A study on Frequency Domain Microstate Feature Fusion for EEG Emotion Recognition

Di Xiao, Zhao Lv, Shiang Hu

2022 7th International Conference on Communication, Image and Signal Processing (CCISP), November 2022

DOI: 10.1109/CCISP55629.2022.9974553
Citations: 0
Abstract
Microstate analysis of EEG signals makes full use of the spatial information in the brain topographic map and reflects the active association of different brain regions. Unlike traditional EEG features, which mostly focus on single-channel information, microstate features capture the spatio-temporal information of EEG signals. Whereas most microstate studies focus on dimensional emotions, our experiments classify positive, neutral, and negative discrete emotions using the SEED database. This work filters each subject's data into five frequency bands and calculates the microstate topographic maps of the EEG signals in each band separately. The features extracted from the microstate classes are coverage, duration, occurrence, and the transition probabilities between microstates. Examining gender differences in the dominant microstate pattern for each emotion and comparing microstates, we found that the brain activity of males in all three emotional states, and of females in negative emotional states, was related to a frontal-occipital pattern, while female brain activity in positive and neutral emotional states was associated with the left and right brain areas. We also investigated traditional power spectral features; these features, either fused across frequency bands or left unfused, were fed into classifiers such as the K-Nearest Neighbor (KNN) and the Support Vector Machine (SVM) to classify the discrete emotional labels in SEED. The average classification accuracy over 15 subjects was 97.67±1.4% and 92.58±3.24%, respectively.
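The pipeline the abstract describes — band-pass filtering the EEG into frequency bands, then computing coverage, duration, occurrence, and transition probabilities from a microstate label sequence — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the band boundaries, the 200 Hz sampling rate (SEED's downsampled rate), the four-state microstate assumption, and all function names are assumptions made for the example; the microstate clustering step that produces the label sequence is not shown.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hypothetical five-band split; the paper does not list its exact band edges.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def bandpass(eeg, low, high, fs=200):
    """Zero-phase band-pass filter of an EEG array (channels x samples)."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def microstate_features(labels, n_states=4, fs=200):
    """Coverage, mean duration (s), occurrence (per s), and transition
    probabilities from a per-sample microstate label sequence."""
    labels = np.asarray(labels)
    n = len(labels)
    # Segment the sequence into runs of a constant microstate label.
    change = np.flatnonzero(np.diff(labels)) + 1
    starts = np.concatenate(([0], change))
    ends = np.concatenate((change, [n]))
    run_labels = labels[starts]
    run_lens = ends - starts

    # Coverage: fraction of samples spent in each state.
    coverage = np.array([(labels == k).mean() for k in range(n_states)])
    # Duration: mean length of each state's runs, in seconds.
    duration = np.array([run_lens[run_labels == k].mean() / fs
                         if np.any(run_labels == k) else 0.0
                         for k in range(n_states)])
    # Occurrence: number of runs of each state per second.
    occurrence = np.array([(run_labels == k).sum() / (n / fs)
                           for k in range(n_states)])
    # Transition probabilities between successive runs (row-normalized).
    trans = np.zeros((n_states, n_states))
    for src, dst in zip(run_labels[:-1], run_labels[1:]):
        trans[src, dst] += 1
    row = trans.sum(axis=1, keepdims=True)
    trans = np.divide(trans, row, out=np.zeros_like(trans), where=row > 0)

    return np.concatenate([coverage, duration, occurrence, trans.ravel()])
```

Feature vectors built this way per subject and per band (here 4 + 4 + 4 + 16 = 28 values for four states) could then be concatenated across bands for the "fused" condition and fed to KNN or SVM classifiers, e.g. scikit-learn's `KNeighborsClassifier` and `SVC`.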