A lightweight Future Skeleton Generation Network (FSGN) based on spatio-temporal encoding and decoding

IF 7.2 · CAS Region 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE · Knowledge-Based Systems · Pub Date: 2024-11-08 · DOI: 10.1016/j.knosys.2024.112717
Tingyu Liu, Chenyi Weng, Jun Huang, Zhonghua Ni
Knowledge-Based Systems, Volume 306, Article 112717.
Citations: 0

Abstract

Early warning in industrial applications is far more valuable than post-event analysis, so human activity prediction based on partially observed skeleton sequences has become a popular research area. Recent studies focus on building complex deep learning networks to generate accurate future skeleton data but overlook the requirement for timeliness. In contrast to such frame-by-frame generation methods, we propose a Future Skeleton Generation Network (FSGN) based on a spatio-temporal encoding and decoding framework. First, we design a dynamically regulated input module that guarantees equal-length input from partially observed data, and introduce modules such as the discrete cosine transform (DCT) and low-pass filtering (LPF) to retain the important information. Second, we employ an improved multi-layer perceptron (MLP) structure as the basic computational unit of the encoding and decoding framework to extract spatio-temporal information, and propose a loss function formed from the multi-dimensional motion error of the human skeleton. Finally, an output module symmetric to the input module generates the future activity data. Results show that the proposed FSGN achieves a small parameter count (0.12 M) and higher generation accuracy, and can effectively provide future information for human activity prediction tasks.
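The input-module idea described above (encoding a partially observed skeleton sequence with the DCT and keeping only low-frequency coefficients, then inverting symmetrically at the output) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names, the array shapes, and the number of retained coefficients are all assumptions, since the abstract gives no implementation details.

```python
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II basis matrix of size (n, n); D @ D.T = I."""
    k = np.arange(n)[:, None]          # frequency index (rows)
    t = np.arange(n)[None, :]          # time index (columns)
    D = np.sqrt(2.0 / n) * np.cos(np.pi * k * (2 * t + 1) / (2 * n))
    D[0] /= np.sqrt(2.0)               # normalize the DC row
    return D

def encode_sequence(skeleton, n_coeffs):
    """skeleton: (T, J*3) array of T observed frames of J 3-D joints.
    Project onto the temporal DCT basis and keep only the first
    n_coeffs coefficients (low-pass filtering in frequency domain)."""
    T = skeleton.shape[0]
    D = dct_matrix(T)
    return (D @ skeleton)[:n_coeffs]   # (n_coeffs, J*3)

def decode_sequence(coeffs, t_out):
    """Zero-pad the truncated coefficients and invert with a DCT basis
    of length t_out, producing an equal-length output sequence."""
    padded = np.zeros((t_out, coeffs.shape[1]))
    padded[: coeffs.shape[0]] = coeffs
    return dct_matrix(t_out).T @ padded  # (t_out, J*3)

# Toy example: 10 observed frames, 5 joints with 3-D coordinates.
obs = np.random.randn(10, 15)
z = encode_sequence(obs, n_coeffs=4)   # compact low-frequency code
rec = decode_sequence(z, t_out=10)     # smoothed reconstruction
```

Because the DCT basis is orthonormal, keeping all T coefficients reconstructs the observation exactly; truncating acts as a low-pass filter, which is one plausible reading of the DCT + LPF pairing in the abstract.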
Source journal

Knowledge-Based Systems (Engineering & Technology – Computer Science: Artificial Intelligence)

CiteScore: 14.80
Self-citation rate: 12.50%
Articles per year: 1245
Review time: 7.8 months

Journal description: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on knowledge-based and other artificial-intelligence-technique-based systems. The journal aims to support human prediction and decision-making through data science and computation techniques, to provide balanced coverage of theory and practical study, and to encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.
Latest articles from this journal

Progressive de-preference task-specific processing for generalizable person re-identification
GKA-GPT: Graphical knowledge aggregation for multiturn dialog generation
A novel spatio-temporal feature interleaved contrast learning neural network from a robustness perspective
PSNet: A non-uniform illumination correction method for underwater images based pseudo-siamese network
A novel domain-private-suppress meta-recognition network based universal domain generalization for machinery fault diagnosis