{"title":"Emotion Recognition by Learning the Manifold of Fused Multiscale Information of EEG Signals","authors":"Cunbo Li;Shuhan Zhang;Yufeng Mu;Lei Yang;Yueheng Peng;Fali Li;Yangsong Zhang;Zhen Liang;Zehong Cao;Feng Wan;Dezhong Yao;Peiyang Li;Peng Xu","doi":"10.1109/TAFFC.2025.3555226","DOIUrl":null,"url":null,"abstract":"Recent research has consistently indicated that the fusion of electroencephalography (EEG) features from multiple modalities can integrate cognitive state expressions across diverse dimensions, resulting in a substantial increase in emotion recognition accuracy. However, redundant information within the fused multimodal features could lead to the curse of dimensionality and overfitting of the learning model. In this work, we propose a multiscale EEG feature fusion and representation strategy for EEG emotion recognition named manifold of multiscale information fusion (MMIF), in which the optimal manifold of the multiscale fusion of local and global brain activation patterns can be automatically learned to realize an efficient representation of emotional EEG signals. To evaluate the performance, in this work, both off- and online EEG emotion recognition experiments were conducted, and the experimental results consistently verified the effectiveness and feasibility of the MMIF applied in real-time emotion decoding systems. Furthermore, the analytical experiments confirmed the discriminative capabilities and cognitive interpretability of the MMIF. In summary, the proposed MMIF model may provide an efficient avenue for exploring representations and enhancing the discrimination of multimodal fusion features, which may also provide a promising solution for designing online affective braincomputer interaction systems.","PeriodicalId":13131,"journal":{"name":"IEEE Transactions on Affective Computing","volume":"16 3","pages":"2172-2188"},"PeriodicalIF":9.8000,"publicationDate":"2025-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Affective Computing","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10943131/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Recent research has consistently indicated that fusing electroencephalography (EEG) features from multiple modalities can integrate cognitive state expressions across diverse dimensions, yielding a substantial increase in emotion recognition accuracy. However, redundant information within the fused multimodal features can lead to the curse of dimensionality and overfitting of the learning model. In this work, we propose a multiscale EEG feature fusion and representation strategy for EEG emotion recognition, named manifold of multiscale information fusion (MMIF), in which the optimal manifold of the multiscale fusion of local and global brain activation patterns is automatically learned to realize an efficient representation of emotional EEG signals. To evaluate its performance, both offline and online EEG emotion recognition experiments were conducted, and the experimental results consistently verified the effectiveness and feasibility of MMIF applied in real-time emotion decoding systems. Furthermore, the analytical experiments confirmed the discriminative capability and cognitive interpretability of MMIF. In summary, the proposed MMIF model may provide an efficient avenue for exploring representations and enhancing the discrimination of multimodal fusion features, and may also offer a promising solution for designing online affective brain-computer interaction systems.
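The sketch below is an illustrative analogue of the pipeline the abstract describes, not the authors' MMIF implementation: it extracts two scales of EEG features (per-channel band power as a "local" scale and channel-wise correlation as a "global" scale), fuses them by concatenation, and then learns a low-dimensional manifold of the fused vector with a standard method (Isomap from scikit-learn) to illustrate how manifold learning can curb the dimensionality and redundancy of fused features. The band definitions, the correlation-based connectivity, and the choice of Isomap are all assumptions made for illustration.

```python
# Illustrative sketch only: approximates "fuse multiscale EEG features, then learn
# a low-dimensional manifold of the fusion" with standard tools; it is NOT the
# paper's MMIF model. Bands, connectivity measure, and Isomap are assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.manifold import Isomap
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for segmented EEG: (n_trials, n_channels, n_samples) at 128 Hz.
fs = 128
eeg = rng.standard_normal((60, 32, 4 * fs))

def band_power(epochs, fs, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """Per-channel spectral power in several bands -> a 'local' feature scale."""
    freqs, psd = welch(epochs, fs=fs, axis=-1, nperseg=fs)
    feats = []
    for lo, hi in bands:
        mask = (freqs >= lo) & (freqs < hi)
        feats.append(psd[..., mask].mean(axis=-1))            # (n_trials, n_channels)
    return np.concatenate(feats, axis=-1)                      # (n_trials, n_channels * n_bands)

def connectivity(epochs):
    """Channel-by-channel correlation -> a 'global' (network-level) feature scale."""
    n_trials, n_ch, _ = epochs.shape
    iu = np.triu_indices(n_ch, k=1)
    return np.stack([np.corrcoef(trial)[iu] for trial in epochs])  # (n_trials, n_pairs)

# Fuse the two scales by simple concatenation (the paper learns the fusion; this is a placeholder).
fused = np.hstack([band_power(eeg, fs), connectivity(eeg)])
fused = StandardScaler().fit_transform(fused)

# Learn a low-dimensional manifold of the fused features to reduce redundancy and overfitting risk.
embedding = Isomap(n_neighbors=10, n_components=8).fit_transform(fused)
print(embedding.shape)  # (60, 8): compact representation for a downstream emotion classifier
```

In this toy setup the fused vector has several hundred dimensions per trial, while the embedded representation has only eight, which is the kind of compression the abstract motivates before classification.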
About the Journal
The IEEE Transactions on Affective Computing is an international and interdisciplinary journal. Its primary goal is to share research findings on the development of systems capable of recognizing, interpreting, and simulating human emotions and related affective phenomena. The journal publishes original research on the underlying principles and theories that explain how and why affective factors shape human-technology interactions. It also focuses on how techniques for sensing and simulating affect can enhance our understanding of human emotions and processes. Additionally, the journal explores the design, implementation, and evaluation of systems that prioritize the consideration of affect in their usability. We also welcome surveys of existing work that provide new perspectives on the historical and future directions of this field.