Dual-Channel BiGRU–Transformer-Graph Fusion Network for Space Micromotion Targets Recognition Based on Radar Network Systems

IF 5.7 | CAS Region 2 (Computer Science) | JCR Q1 (Engineering, Aerospace) | IEEE Transactions on Aerospace and Electronic Systems | Pub Date: 2025-04-03 | DOI: 10.1109/TAES.2025.3557396
Zhi-Hao Wang;Yuan-Peng Zhang;Kai-Ming Li;Ying Luo;Qun Zhang;Ling-Hua Su
{"title":"Dual-Channel BiGRU–Transformer-Graph Fusion Network for Space Micromotion Targets Recognition Based on Radar Network Systems","authors":"Zhi-Hao Wang;Yuan-Peng Zhang;Kai-Ming Li;Ying Luo;Qun Zhang;Ling-Hua Su","doi":"10.1109/TAES.2025.3557396","DOIUrl":null,"url":null,"abstract":"The radar network system (RNS) can provide multiband and multiview target information, which helps improve target recognition ability. A space micromotion targets recognition method based on RNSs with a dual-channel bidirectional gated recurrent unit (BiGRU)–Transformer-graph fusion (DC-BiGT-GF) network is proposed in this article. First, a temporal feature extraction subnetwork based on BiGRU–Transformer is utilized to process the real and imaginary parts of complex-valued radar cross section in parallel to capture the local and global temporal dependencies. Second, a spatial feature extraction subnetwork is designed to extract the potential spatial dependencies, which integrates a predefined graph and an adaptive graph. In the real part channel, the Euclidian distance between radars is used to construct the adjacency matrix to represent the predefined graph structure, and the adaptive adjacency matrix is designed to learn the potential graph structure from end to end. To represent the frequency-domain features, the phase difference is applied to the imaginary part channel to build a predefined adjacency matrix. Meanwhile, the adaptive adjacency matrix is calculated using cosine similarity to obtain the geometric features. Finally, extensive experiments show that the DC-BiGT-GF network can reliably recognize the space micromotion targets under low SNR and low radar pulse repetition frequency conditions. Recognition accuracy is greatly improved compared with the baseline methods.","PeriodicalId":13157,"journal":{"name":"IEEE Transactions on Aerospace and Electronic Systems","volume":"61 4","pages":"9812-9828"},"PeriodicalIF":5.7000,"publicationDate":"2025-04-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10949040","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Aerospace and Electronic Systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10949040/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, AEROSPACE","Score":null,"Total":0}
引用次数: 0

Abstract

The radar network system (RNS) can provide multiband and multiview target information, which helps improve target recognition ability. A space micromotion target recognition method based on RNSs with a dual-channel bidirectional gated recurrent unit (BiGRU)–Transformer-graph fusion (DC-BiGT-GF) network is proposed in this article. First, a temporal feature extraction subnetwork based on the BiGRU–Transformer is used to process the real and imaginary parts of the complex-valued radar cross section (RCS) in parallel and capture the local and global temporal dependencies. Second, a spatial feature extraction subnetwork, which integrates a predefined graph and an adaptive graph, is designed to extract the potential spatial dependencies. In the real-part channel, the Euclidean distance between radars is used to construct the adjacency matrix that represents the predefined graph structure, and an adaptive adjacency matrix is designed to learn the potential graph structure end to end. In the imaginary-part channel, the phase difference is used to build the predefined adjacency matrix and represent the frequency-domain features, while the adaptive adjacency matrix is computed from cosine similarity to obtain the geometric features. Finally, extensive experiments show that the DC-BiGT-GF network can reliably recognize space micromotion targets under low SNR and low radar pulse repetition frequency conditions, and its recognition accuracy is greatly improved over the baseline methods.
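The abstract outlines three building blocks: a BiGRU–Transformer temporal encoder per channel, a predefined graph over the radar nodes, and an adaptive graph learned from the data. The PyTorch sketch below illustrates how such a pipeline could be wired together for a single (real-part) channel; it is not the authors' implementation. All module names (TemporalBranch, GraphFusionLayer, etc.), layer sizes, the Gaussian kernel that turns inter-radar distances into affinities, the number of radars, and the five-class output are illustrative assumptions.

# Minimal single-channel sketch of the ideas in the abstract (assumptions noted
# in comments); not the published DC-BiGT-GF implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TemporalBranch(nn.Module):
    # BiGRU captures local temporal dependencies; the Transformer encoder on
    # top captures global ones. Hidden sizes and depths are assumptions.
    def __init__(self, in_dim=1, hidden=64, heads=4, layers=2):
        super().__init__()
        self.bigru = nn.GRU(in_dim, hidden, batch_first=True, bidirectional=True)
        enc = nn.TransformerEncoderLayer(d_model=2 * hidden, nhead=heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc, num_layers=layers)

    def forward(self, x):                       # x: (batch, time, in_dim)
        h, _ = self.bigru(x)                    # (batch, time, 2*hidden)
        h = self.transformer(h)                 # global dependencies
        return h.mean(dim=1)                    # pooled temporal feature

def predefined_adjacency(radar_xyz):
    # Predefined graph: pairwise Euclidean distances between radar sites,
    # mapped to affinities with a Gaussian kernel (kernel width is a guess).
    d = torch.cdist(radar_xyz, radar_xyz)
    a = torch.exp(-d ** 2 / (2 * d.mean() ** 2 + 1e-8))
    return a / a.sum(dim=-1, keepdim=True)      # row-normalized (N, N)

def adaptive_adjacency(node_feats):
    # Adaptive graph: cosine similarity between learned node embeddings.
    f = F.normalize(node_feats, dim=-1)
    return F.relu(f @ f.transpose(-1, -2))      # (batch, N, N)

class GraphFusionLayer(nn.Module):
    # One graph-convolution step that mixes the predefined and adaptive
    # adjacency matrices with a learned weight before aggregating neighbors.
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, x, a_pre, a_ada):         # x: (batch, N, dim)
        a = self.alpha * a_pre + (1.0 - self.alpha) * a_ada
        return F.relu(self.lin(a @ x))

if __name__ == "__main__":
    batch, n_radar, t = 8, 4, 256               # 4 radars in the network (assumed)
    rcs_real = torch.randn(batch, n_radar, t)   # real-part RCS sequences
    radar_xyz = torch.rand(n_radar, 3) * 1e3    # hypothetical radar positions (m)

    temporal = TemporalBranch()
    node_feats = torch.stack(
        [temporal(rcs_real[:, i].unsqueeze(-1)) for i in range(n_radar)], dim=1)

    a_pre = predefined_adjacency(radar_xyz)     # (N, N), distance-based
    a_ada = adaptive_adjacency(node_feats)      # (batch, N, N), similarity-based
    fused = GraphFusionLayer(node_feats.shape[-1])(node_feats, a_pre, a_ada)
    logits = nn.Linear(fused.shape[-1], 5)(fused.mean(dim=1))  # 5 classes assumed
    print(logits.shape)                         # torch.Size([8, 5])

In the paper, two such channels (real and imaginary parts of the RCS) are processed in parallel and fused, with the imaginary-part channel building its predefined graph from phase differences rather than distances; the sketch keeps only one channel for brevity.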
Source journal
CiteScore: 7.80
Self-citation rate: 13.60%
Articles published: 433
Review time: 8.7 months
Journal description: IEEE Transactions on Aerospace and Electronic Systems focuses on the organization, design, development, integration, and operation of complex systems for space, air, ocean, or ground environments. These systems include, but are not limited to, navigation, avionics, spacecraft, aerospace power, radar, sonar, telemetry, defense, transportation, automated testing, and command and control.
Latest articles from this journal
Operating Line of Battery-driven Aerodynamic Rotor for Electric Aircraft Applications
Quantum Reinforcement Learning for Joint Control, Communication, and Computing in Stabilized Reusable Space Rocket
Inter-satellite laser link enhanced GNSS orbit determination for high Earth orbit gravitational wave detectors
FORM: Few-shot Online Learning for Radar-based Human Motion Recognition
Variational Bayesian inference based 2D-DOAs estimation for time-varying number of dynamic sources