Omnidirectional Human Motion Recognition With Monostatic Radar System Using Active Learning

IF 5.7 · CAS Tier 2 (Computer Science) · JCR Q1 (Engineering, Aerospace) · IEEE Transactions on Aerospace and Electronic Systems · Pub Date: 2024-10-28 · DOI: 10.1109/TAES.2024.3487146
Zhengkang Zhou;Yang Yang;Beichen Li;Yue Lang
{"title":"Omnidirectional Human Motion Recognition With Monostatic Radar System Using Active Learning","authors":"Zhengkang Zhou;Yang Yang;Beichen Li;Yue Lang","doi":"10.1109/TAES.2024.3487146","DOIUrl":null,"url":null,"abstract":"‘‘Angle sensitivity” aggravates the difficulty in radar-based omnidirectional human motion recognition. This issue is addressed in earlier work by using omnidirectional radar data for training. However, this practice requires labor-intensive radar measurements and a time-consuming annotation process. Tackling this issue, this article first introduces the active learning technique for the radar-based omnidirectional recognition problem, and we present a hybrid-uncertainty active learning method, which significantly reduces the annotation expenses required to train an omnidirectional motion classifier. In the context of the complex motions and varying angles, we propose a pixelwise similarity assessment methodology in addition to semantic-based sampling. This approach is proven to alleviate the issue of “imbalanced sampling” in active learning significantly by rebalancing the selected samples across categories. Furthermore, a hybrid-uncertainty dimension is introduced to quantify the uncertainty of the unlabeled samples from both pixel and semantic levels. The dimension is evaluated through three perspectives, including the consistency factor, difficulty factor, and pixelwise similarity. The experimental results exhibit that our algorithm achieves a recognition accuracy of 76.06% using only 40% of labeled data, which is a mere decrease of only 0.15% compared to the accuracy achieved with 100% labeled data. 
Our approach surpasses six state-of-the-art active learning methods in solving the omnidirectional problem, and ablation studies confirm the efficacy of each component presented in our model.","PeriodicalId":13157,"journal":{"name":"IEEE Transactions on Aerospace and Electronic Systems","volume":"61 2","pages":"3456-3469"},"PeriodicalIF":5.7000,"publicationDate":"2024-10-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Aerospace and Electronic Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10736993/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, AEROSPACE","Score":null,"Total":0}
引用次数: 0

Abstract

"Angle sensitivity" aggravates the difficulty of radar-based omnidirectional human motion recognition. Earlier work addresses this issue by training on omnidirectional radar data, but that practice requires labor-intensive radar measurements and a time-consuming annotation process. To tackle this issue, this article is the first to introduce active learning to the radar-based omnidirectional recognition problem, presenting a hybrid-uncertainty active learning method that significantly reduces the annotation cost of training an omnidirectional motion classifier. To handle complex motions and varying aspect angles, we propose a pixelwise similarity assessment methodology in addition to semantic-based sampling. This approach is shown to significantly alleviate the "imbalanced sampling" problem in active learning by rebalancing the selected samples across categories. Furthermore, a hybrid-uncertainty dimension is introduced to quantify the uncertainty of unlabeled samples at both the pixel and semantic levels; it is evaluated from three perspectives: the consistency factor, the difficulty factor, and pixelwise similarity. Experimental results show that our algorithm achieves a recognition accuracy of 76.06% using only 40% of the labeled data, a decrease of only 0.15% relative to the accuracy achieved with 100% labeled data. Our approach surpasses six state-of-the-art active learning methods on the omnidirectional problem, and ablation studies confirm the efficacy of each component of our model.
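The selection strategy the abstract describes — scoring each unlabeled sample by combining a semantic-level uncertainty with a pixel-level similarity term, then querying the highest-scoring samples for annotation — can be sketched generically. The scoring below is an illustrative assumption, not the paper's exact method: predictive entropy stands in for the semantic consistency/difficulty factors, nearest-neighbor pixel distance stands in for the pixelwise-similarity term, and the names `select_batch` and `alpha` are hypothetical.

```python
import numpy as np

def semantic_uncertainty(probs):
    # Semantic-level term: predictive entropy of the classifier's
    # softmax outputs (shape: n_samples x n_classes).
    p = np.clip(probs, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

def pixel_dissimilarity(unlabeled, labeled):
    # Pixel-level term: mean squared distance from each unlabeled
    # spectrogram to its nearest labeled spectrogram (flattened).
    u = unlabeled.reshape(len(unlabeled), -1)
    l = labeled.reshape(len(labeled), -1)
    d = ((u[:, None, :] - l[None, :, :]) ** 2).mean(axis=-1)
    return d.min(axis=1)  # larger = less like anything already labeled

def select_batch(probs, unlabeled, labeled, k, alpha=0.5):
    # Hybrid score: weighted sum of the semantic and pixel terms.
    # Returns indices of the k highest-scoring unlabeled samples.
    score = (alpha * semantic_uncertainty(probs)
             + (1 - alpha) * pixel_dissimilarity(unlabeled, labeled))
    return np.argsort(score)[-k:][::-1]
```

In a full active-learning loop, the selected indices would be sent to an annotator, moved into the labeled pool, and the classifier retrained before the next query round; the pixel term biases selection toward samples (e.g., unseen aspect angles) that the labeled pool does not yet cover.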
Source journal: IEEE Transactions on Aerospace and Electronic Systems
CiteScore: 7.80
Self-citation rate: 13.60%
Articles per year: 433
Average review time: 8.7 months
Journal description: IEEE Transactions on Aerospace and Electronic Systems focuses on the organization, design, development, integration, and operation of complex systems for space, air, ocean, or ground environments. These systems include, but are not limited to, navigation, avionics, spacecraft, aerospace power, radar, sonar, telemetry, defense, transportation, automated testing, and command and control.
Latest articles in this journal:
- Self-Calibrating UAV Navigation: Reinforcement Learning Approaches for Horizontal Trajectory Estimation
- Cascaded Symmetry-Aware Network for Component Segmentation of Satellites
- Multi-Agent Inverse Reinforcement Learning for Radar-based Detection of Pareto-Efficient UAV Coordination
- Power Allocation for LED Arrays in Asynchronous Visible Light Positioning Systems
- Two-Degree-of-Freedom Compound Split Transmission Control for a Helicopter Powertrain