RCTrans-Net: A spatiotemporal model for fast-time human detection behind walls using ultrawideband radar

IF 4.0 | JCR Q1 (COMPUTER SCIENCE, HARDWARE & ARCHITECTURE) | CAS Tier 3, Computer Science | Computers & Electrical Engineering | Pub Date: 2024-11-20 | DOI: 10.1016/j.compeleceng.2024.109873
Cries Avian , Jenq-Shiou Leu , Hang Song , Jun-ichi Takada , Nur Achmad Sulistyo Putro , Muhammad Izzuddin Mahali , Setya Widyawan Prakosa
Citations: 0

Abstract

RCTrans-Net: A spatiotemporal model for fast-time human detection behind walls using ultrawideband radar
Ultrawideband (UWB) radar systems are becoming increasingly popular for detecting human presence, even through walls. Recent advancements in signal processing use deep learning techniques, which are known for their accuracy. While earlier methods focused on spatial information using Convolutional Neural Networks (CNNs), newer research highlights the importance of temporal information, such as how data peaks shift over time. This study introduces RCTrans-Net, a deep-learning architecture that combines RCNet (a Residual CNN) for spatial features with TransNet (a Transformer) for temporal features. This fusion improves human presence classification in fast-time signal processing. Tested under various conditions—different materials, body orientations, ranges, and radar heights—RCTrans-Net achieved high performance with F1-scores of 0.997±0.000 for static, 0.967±0.004 for dynamic, and 0.978±0.001 for combined scenarios. The architecture outperforms previous methods and offers real-time processing with an inference time of about one millisecond.
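The paper's implementation is not included on this page. As a rough illustration of the fusion idea the abstract describes — a residual-CNN branch for spatial (fast-time) features combined with an attention branch for temporal (slow-time) features — here is a toy NumPy sketch. All function names, kernel sizes, and array shapes are hypothetical, chosen only to make the two-branch structure concrete; they are not taken from RCTrans-Net itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_valid(x, kernel):
    """Simple 1-D 'valid' convolution over a fast-time radar trace."""
    n = len(x) - len(kernel) + 1
    return np.array([np.dot(x[i:i + len(kernel)], kernel) for i in range(n)])

def residual_block(x, kernel):
    """Residual-CNN idea: conv output plus a (cropped) skip connection."""
    y = conv1d_valid(x, kernel)
    return y + x[:len(y)]

def self_attention(X):
    """Single-head scaled dot-product attention across time steps (rows)."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

# Toy input: 8 slow-time frames x 64 fast-time samples of UWB echoes.
frames = rng.normal(size=(8, 64))

# Spatial branch (RCNet-like): per-frame residual convolution, then pool.
kernel = rng.normal(size=5)
spatial = np.stack([residual_block(f, kernel) for f in frames]).mean(axis=0)

# Temporal branch (TransNet-like): attention across frames, then pool.
temporal = self_attention(frames).mean(axis=0)

# Fusion: concatenate branch features; a linear head would then classify
# "human present" vs "no human" from the fused vector.
fused = np.concatenate([spatial, temporal])
print(fused.shape)  # (124,) = 60 spatial + 64 temporal features
```

In a real model the pooling, head, and training would be handled by a deep-learning framework; the point here is only that the two branches see the same echoes along different axes (fast-time vs slow-time) before fusion.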
Source journal: Computers & Electrical Engineering (Engineering Technology — Electrical & Electronic Engineering)
CiteScore: 9.20
Self-citation rate: 7.00%
Articles per year: 661
Review time: 47 days
Journal overview: The impact of computers has nowhere been more revolutionary than in electrical engineering. The design, analysis, and operation of electrical and electronic systems are now dominated by computers, a transformation that has been motivated by the natural ease of interface between computers and electrical systems, and the promise of spectacular improvements in speed and efficiency. Published since 1973, Computers & Electrical Engineering provides rapid publication of topical research into the integration of computer technology and computational techniques with electrical and electronic systems. The journal publishes papers featuring novel implementations of computers and computational techniques in areas like signal and image processing, high-performance computing, parallel processing, and communications. Special attention will be paid to papers describing innovative architectures, algorithms, and software tools.