Dual-Path CNN–BiLSTM for mmWave-Based Human Skeletal Pose Estimation

IEEE Sensors Journal · IF 4.3 · CAS Tier 2 (Comprehensive) · JCR Q1, Engineering, Electrical & Electronic · Publication date: 2025-02-27 · DOI: 10.1109/JSEN.2025.3543343
Yuqiang He;Jun Wang;Yaxin Li;Yuquan Luo
{"title":"Dual-Path CNN–BiLSTM for mmWave-Based Human Skeletal Pose Estimation","authors":"Yuqiang He;Jun Wang;Yaxin Li;Yuquan Luo","doi":"10.1109/JSEN.2025.3543343","DOIUrl":null,"url":null,"abstract":"In this article, we introduce a novel method for human skeletal joint localization using millimeter-wave (mmWave) radar, effectively overcoming the limitations of vision-based pose estimation methods, which are vulnerable to changes in lighting conditions and pose privacy concerns. The method leverages mmWave radar to generate 4-D time-series point cloud data, which is then projected onto the depth-azimuth and depth-elevation planes. This projection helps mitigate the sparsity inherent in traditional point cloud data and reduces the complexity of the machine learning model required for pose estimation. The input data structure is optimized using a sliding window technique, where consecutive frames are processed by a convolutional neural network (CNN) to extract spatial features. These features are then sorted chronologically and fed into a bi-directional long short-term memory (BiLSTM) to capture temporal features, resulting in the accurate localization of 25 skeletal joints. To validate the performance and effectiveness of the proposed method, we created a dataset comprising three body types and ten distinct actions. The experimental results demonstrate the method’s outstanding human pose estimation capability.","PeriodicalId":447,"journal":{"name":"IEEE Sensors Journal","volume":"25 7","pages":"11683-11696"},"PeriodicalIF":4.3000,"publicationDate":"2025-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Sensors Journal","FirstCategoryId":"103","ListUrlMain":"https://ieeexplore.ieee.org/document/10907798/","RegionNum":2,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

In this article, we introduce a novel method for human skeletal joint localization using millimeter-wave (mmWave) radar, effectively overcoming the limitations of vision-based pose estimation methods, which are vulnerable to changes in lighting conditions and pose privacy concerns. The method leverages mmWave radar to generate 4-D time-series point cloud data, which is then projected onto the depth-azimuth and depth-elevation planes. This projection helps mitigate the sparsity inherent in traditional point cloud data and reduces the complexity of the machine learning model required for pose estimation. The input data structure is optimized using a sliding window technique, where consecutive frames are processed by a convolutional neural network (CNN) to extract spatial features. These features are then sorted chronologically and fed into a bi-directional long short-term memory (BiLSTM) to capture temporal features, resulting in the accurate localization of 25 skeletal joints. To validate the performance and effectiveness of the proposed method, we created a dataset comprising three body types and ten distinct actions. The experimental results demonstrate the method’s outstanding human pose estimation capability.
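The abstract outlines a two-stage pipeline: per-frame CNNs extract spatial features from the two plane projections (depth-azimuth and depth-elevation) of each point-cloud frame in a sliding window, and a BiLSTM then aggregates those features over time to regress the positions of 25 skeletal joints. The PyTorch sketch below illustrates one way such a dual-path CNN-BiLSTM could be wired together. It is a minimal sketch based only on the abstract: the class and layer names, window length, projection resolution, feature sizes, and the choice of regressing the joints of the last frame in the window are assumptions, not the authors' implementation.

# Hypothetical dual-path CNN-BiLSTM sketch; all hyperparameters are assumptions.
import torch
import torch.nn as nn

class DualPathCNNBiLSTM(nn.Module):
    def __init__(self, feat_dim=128, hidden=128, num_joints=25):
        super().__init__()
        # One small CNN per projection plane (depth-azimuth, depth-elevation).
        def make_cnn():
            return nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.AdaptiveAvgPool2d((4, 4)), nn.Flatten(),
                nn.Linear(32 * 4 * 4, feat_dim), nn.ReLU(),
            )
        self.cnn_az = make_cnn()   # depth-azimuth path
        self.cnn_el = make_cnn()   # depth-elevation path
        # BiLSTM over the chronologically ordered per-frame features.
        self.bilstm = nn.LSTM(2 * feat_dim, hidden, batch_first=True,
                              bidirectional=True)
        # Regress 3-D coordinates for each skeletal joint.
        self.head = nn.Linear(2 * hidden, num_joints * 3)

    def forward(self, az, el):
        # az, el: (batch, T, 1, H, W) sliding windows of projected frames.
        B, T = az.shape[:2]
        f_az = self.cnn_az(az.flatten(0, 1)).view(B, T, -1)
        f_el = self.cnn_el(el.flatten(0, 1)).view(B, T, -1)
        seq = torch.cat([f_az, f_el], dim=-1)        # fuse both paths per frame
        out, _ = self.bilstm(seq)                    # temporal modeling
        return self.head(out[:, -1]).view(B, -1, 3)  # joints for the last frame

# Example: a window of 10 frames projected onto 64x64 planes.
model = DualPathCNNBiLSTM()
az = torch.randn(2, 10, 1, 64, 64)
el = torch.randn(2, 10, 1, 64, 64)
print(model(az, el).shape)  # torch.Size([2, 25, 3])

Substituting the paper's actual projection resolution, window length, and fusion strategy would only change the tensor shapes fed into the BiLSTM; the spatial-then-temporal structure described in the abstract stays the same.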
Source Journal

IEEE Sensors Journal (Engineering, Electrical & Electronic)
CiteScore: 7.70
Self-citation rate: 14.00%
Articles published: 2058
Review time: 5.2 months
Journal Description

The fields of interest of the IEEE Sensors Journal are the theory, design, fabrication, manufacturing and applications of devices for sensing and transducing physical, chemical and biological phenomena, with emphasis on the electronics and physics aspects of sensors and integrated sensors-actuators. IEEE Sensors Journal deals with the following:
- Sensor Phenomenology, Modelling, and Evaluation
- Sensor Materials, Processing, and Fabrication
- Chemical and Gas Sensors
- Microfluidics and Biosensors
- Optical Sensors
- Physical Sensors: Temperature, Mechanical, Magnetic, and others
- Acoustic and Ultrasonic Sensors
- Sensor Packaging
- Sensor Networks
- Sensor Applications
- Sensor Systems: Signals, Processing, and Interfaces
- Actuators and Sensor Power Systems
- Sensor Signal Processing for high precision and stability (amplification, filtering, linearization, modulation/demodulation) and under harsh conditions (EMC, radiation, humidity, temperature); energy consumption/harvesting
- Sensor Data Processing (soft computing with sensor data, e.g., pattern recognition, machine learning, evolutionary computation; sensor data fusion; processing of wave, e.g., electromagnetic and acoustic, and non-wave, e.g., chemical, gravity, particle, thermal, radiative and non-radiative sensor data; detection, estimation and classification based on sensor data)
- Sensors in Industrial Practice