ER-Net: Efficient Recalibration Network for Multi-View Multi-Person 3D Pose Estimation

Impact Factor: 2.2 | JCR: Q2 (ENGINEERING, MULTIDISCIPLINARY) | CAS Region: 4 (Engineering & Technology) | Journal: CMES-Computer Modeling in Engineering & Sciences | Pub Date: 2023-01-01 | DOI: 10.32604/cmes.2023.024189
Mi Zhou, Rui Liu, Pengfei Yi, Dongsheng Zhou
{"title":"ER-Net: Efficient Recalibration Network for Multi-View Multi-Person 3D Pose Estimation","authors":"Mi Zhou, Rui Liu, Pengfei Yi, Dongsheng Zhou","doi":"10.32604/cmes.2023.024189","DOIUrl":null,"url":null,"abstract":"Multi-view multi-person 3D human pose estimation is a hot topic in the field of human pose estimation due to its wide range of application scenarios. With the introduction of end-to-end direct regression methods, the field has entered a new stage of development. However, the regression results of joints that are more heavily influenced by external factors are not accurate enough even for the optimal method. In this paper, we propose an effective feature recalibration module based on the channel attention mechanism and a relative optimal calibration strategy, which is applied to the multi-view multi-person 3D human pose estimation task to achieve improved detection accuracy for joints that are more severely affected by external factors. Specifically, it achieves relative optimal weight adjustment of joint feature information through the recalibration module and strategy, which enables the model to learn the dependencies between joints and the dependencies between people and their corresponding joints. We call this method as the Efficient Recalibration Network (ER-Net). Finally, experiments were conducted on two benchmark datasets for this task, Campus and Shelf, in which the PCP reached 97.3% and 98.3%, respectively.","PeriodicalId":10451,"journal":{"name":"Cmes-computer Modeling in Engineering & Sciences","volume":"16 1","pages":"0"},"PeriodicalIF":2.2000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cmes-computer Modeling in Engineering & Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.32604/cmes.2023.024189","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
Citations: 1

Abstract

Multi-view multi-person 3D human pose estimation is a hot topic in the field of human pose estimation due to its wide range of application scenarios. With the introduction of end-to-end direct regression methods, the field has entered a new stage of development. However, even for the best existing methods, the regression results for joints that are heavily influenced by external factors are not accurate enough. In this paper, we propose an effective feature recalibration module based on the channel attention mechanism, together with a relative optimal calibration strategy, and apply them to the multi-view multi-person 3D human pose estimation task to improve detection accuracy for joints that are severely affected by external factors. Specifically, the recalibration module and strategy achieve relative optimal weight adjustment of joint feature information, which enables the model to learn the dependencies between joints and between people and their corresponding joints. We call this method the Efficient Recalibration Network (ER-Net). Finally, experiments were conducted on two benchmark datasets for this task, Campus and Shelf, on which the PCP reached 97.3% and 98.3%, respectively.
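The abstract describes the recalibration module only at a high level. As a rough illustration of what channel-attention-based feature recalibration looks like in general, a squeeze-and-excitation style block in PyTorch might resemble the sketch below. This is not the authors' exact ER-Net module or strategy; the class name, reduction ratio, and tensor shapes are assumptions made purely for illustration.

```python
import torch
import torch.nn as nn


class ChannelRecalibration(nn.Module):
    """Minimal squeeze-and-excitation style channel attention block.

    Illustrative sketch only: layer names, the reduction ratio, and the
    placement of this block are assumptions, not the ER-Net design.
    """

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)      # squeeze: global spatial context per channel
        self.fc = nn.Sequential(                 # excitation: learn per-channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.pool(x).view(b, c)              # (B, C) channel descriptors
        w = self.fc(w).view(b, c, 1, 1)          # per-channel scaling factors in (0, 1)
        return x * w                             # recalibrate the feature channels


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)           # hypothetical joint feature maps
    print(ChannelRecalibration(64)(feats).shape)  # torch.Size([2, 64, 32, 32])
```

In such a block, channels carrying more reliable joint information can receive larger weights, which is the general intuition behind recalibrating features for joints that are strongly affected by external factors.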
Source journal
CMES-Computer Modeling in Engineering & Sciences
Categories: ENGINEERING, MULTIDISCIPLINARY; MATHEMATICS, INTERDISCIPLINARY APPLICATIONS
CiteScore: 3.80
Self-citation rate: 16.70%
Articles per year: 298
Average review time: 7.8 months
Journal description: This journal publishes original research papers of reasonable permanent value, in the areas of computational mechanics, computational physics, computational chemistry, and computational biology, pertinent to solids, fluids, gases, biomaterials, and other continua. Various length scales (quantum, nano, micro, meso, and macro) and various time scales (picoseconds to hours) are of interest. Papers which deal with multi-physics problems, as well as those which deal with the interfaces of mechanics, chemistry, and biology, are particularly encouraged. New computational approaches, and more efficient algorithms, which eventually make near-real-time computations possible, are welcome. Original papers dealing with new methods such as meshless methods, and mesh-reduction methods are sought.
Latest articles from this journal:
ThyroidNet: A Deep Learning Network for Localization and Classification of Thyroid Nodules
A Novel SE-CNN Attention Architecture for sEMG-Based Hand Gesture Recognition
ER-Net: Efficient Recalibration Network for Multi-View Multi-Person 3D Pose Estimation
Anomaly Detection of UAV State Data Based on Single-Class Triangular Global Alignment Kernel Extreme Learning Machine
Introduction to the Special Issue on Computational Intelligent Systems for Solving Complex Engineering Problems: Principles and Applications