Emotion Recognition From Multimodal Physiological Signals via Discriminative Correlation Fusion With a Temporal Alignment Mechanism

IEEE Transactions on Cybernetics · IF 9.4 · CAS Region 1 (Computer Science) · JCR Q1 (Automation & Control Systems) · Vol. 54, No. 5, pp. 3079-3092 · Pub Date: 2023-10-20 · DOI: 10.1109/TCYB.2023.3320107
Kechen Hou;Xiaowei Zhang;Yikun Yang;Qiqi Zhao;Wenjie Yuan;Zhongyi Zhou;Sipo Zhang;Chen Li;Jian Shen;Bin Hu
Citations: 0

Abstract

Modeling correlations between multimodal physiological signals [e.g., canonical correlation analysis (CCA)] for emotion recognition has attracted much attention. However, existing studies rarely consider the neural nature of emotional responses within physiological signals. Furthermore, during fusion space construction, the CCA method maximizes only the correlations between different modalities and neglects the discriminative information of different emotional states. Most importantly, temporal mismatches between different neural activities are often ignored; therefore, the theoretical assumption that multimodal data should be aligned in time and space before fusion is not fulfilled. To address these issues, we propose a discriminative correlation fusion method coupled with a temporal alignment mechanism for multimodal physiological signals. We first use neural signal analysis techniques to construct neural representations of the central nervous system (CNS) and autonomic nervous system (ANS), respectively. Then, emotion class labels are introduced in CCA to obtain more discriminative fusion representations from multimodal neural responses, and the temporal alignment between the CNS and ANS is jointly optimized with a fusion procedure that applies the Bayesian algorithm. The experimental results demonstrate that our method significantly improves the emotion recognition performance. Additionally, we show that this fusion method can model the underlying mechanisms in human nervous systems during emotional responses, and our results are consistent with prior findings. This study may guide a new approach for exploring human cognitive function based on physiological signals at different time scales and promote the development of computational intelligence and harmonious human-computer interactions.
Source journal

IEEE Transactions on Cybernetics (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, CYBERNETICS)
CiteScore: 25.40
Self-citation rate: 11.00%
Articles published per year: 1869
Journal description: The scope of the IEEE Transactions on Cybernetics includes computational approaches to the field of cybernetics. Specifically, the transactions welcomes papers on communication and control across machines, or across machines, humans, and organizations. The scope includes such areas as computational intelligence, computer vision, neural networks, genetic algorithms, machine learning, fuzzy systems, cognitive systems, decision making, and robotics, to the extent that they contribute to the theme of cybernetics or demonstrate an application of cybernetics principles.