TT-GCN: Temporal-Tightly Graph Convolutional Network for Emotion Recognition From Gaits

IEEE Transactions on Computational Social Systems · Impact Factor 4.5 · JCR Q1 (Computer Science, Cybernetics) · CAS Tier 2 (Computer Science) · Published: 2024-03-20 · DOI: 10.1109/TCSS.2024.3364378
Tong Zhang;Yelin Chen;Shuzhen Li;Xiping Hu;C. L. Philip Chen
{"title":"TT-GCN: Temporal-Tightly Graph Convolutional Network for Emotion Recognition From Gaits","authors":"Tong Zhang;Yelin Chen;Shuzhen Li;Xiping Hu;C. L. Philip Chen","doi":"10.1109/TCSS.2024.3364378","DOIUrl":null,"url":null,"abstract":"The human gait reflects substantial information about individual emotions. Current gait emotion recognition methods focus on capturing gait topology information and ignore the importance of fine-grained temporal features. This article proposes the temporal-tightly graph convolutional network (TT-GCN) to extract temporal features. TT-GCN comprises three significant mechanisms: the causal temporal convolution network (casual-TCN), the walking direction recognition auxiliary task, and the feature mapping layer. To obtain tight temporal dependencies and enhance the relevance among gait periods, the causal-TCN is introduced. Based on the assumption of emotional consistency in the walking directions, the auxiliary task is proposed to enhance the ability of fine-grained feature extraction. Through the feature mapping layer, affective features can be mapped into the appropriate representation and fused with deep learning features. TT-GCN shows the best performance across five comprehensive metrics. 
All experimental results verify the necessity and feasibility of exploring fine-grained temporal feature extraction.","PeriodicalId":13044,"journal":{"name":"IEEE Transactions on Computational Social Systems","volume":null,"pages":null},"PeriodicalIF":4.5000,"publicationDate":"2024-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Computational Social Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10476603/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, CYBERNETICS","Score":null,"Total":0}
Cited by: 0

Abstract

The human gait carries substantial information about an individual's emotions. Current gait-based emotion recognition methods focus on capturing gait topology and overlook the importance of fine-grained temporal features. This article proposes the temporal-tightly graph convolutional network (TT-GCN) to extract such temporal features. TT-GCN comprises three key mechanisms: a causal temporal convolution network (causal-TCN), a walking-direction recognition auxiliary task, and a feature mapping layer. The causal-TCN is introduced to obtain tight temporal dependencies and strengthen the relevance among gait periods. Based on the assumption that emotion is consistent across walking directions, the auxiliary task is proposed to improve fine-grained feature extraction. Through the feature mapping layer, affective features can be mapped into an appropriate representation and fused with deep-learned features. TT-GCN achieves the best performance across five comprehensive metrics, and all experimental results verify the necessity and feasibility of exploring fine-grained temporal feature extraction.
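The causal-TCN named in the abstract builds on causal temporal convolution, in which the output at frame t depends only on frames at or before t, so temporal dependencies never leak from the future. As a minimal sketch of that idea (not the paper's implementation; the function name, kernel values, and dilation are illustrative assumptions):

```python
import numpy as np

def causal_conv1d(x, kernel, dilation=1):
    """Causal 1-D convolution: output[t] depends only on x[t], x[t-d], x[t-2d], ...
    The sequence is left-padded with zeros so the output matches the input length."""
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([
        sum(kernel[i] * xp[t + pad - i * dilation] for i in range(k))
        for t in range(len(x))
    ])

# Causality check: perturbing a future frame never affects earlier outputs.
x = np.arange(8, dtype=float)
y1 = causal_conv1d(x, kernel=[0.5, 0.3, 0.2], dilation=2)
x2 = x.copy()
x2[5] = 100.0                       # perturb frame 5 only
y2 = causal_conv1d(x2, kernel=[0.5, 0.3, 0.2], dilation=2)
assert np.allclose(y1[:5], y2[:5])  # outputs before t=5 are unchanged
```

Stacking such layers with increasing dilation widens the receptive field over past frames only, which is how causal TCNs model tight temporal dependencies across gait periods.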
Source journal: IEEE Transactions on Computational Social Systems
CiteScore: 10.00
Self-citation rate: 20.00%
Articles per year: 316
About the journal: IEEE Transactions on Computational Social Systems focuses on topics such as modeling, simulation, analysis, and understanding of social systems from a quantitative and/or computational perspective. "Systems" include man-man, man-machine, and machine-machine organizations and adversarial situations, as well as social media structures and their dynamics. More specifically, the Transactions publishes articles on modeling the dynamics of social systems, methodologies for incorporating and representing socio-cultural and behavioral aspects in computational modeling, analysis of social system behavior and structure, and paradigms for social systems modeling and simulation. The journal also features articles on social network dynamics, social intelligence and cognition, social systems design and architectures, socio-cultural modeling and representation, and computational behavior modeling, together with their applications.