SAR Target Recognition Method Based on Adaptive Weighted Decision Fusion of Deep Features

Recent Advances in Electrical & Electronic Engineering (IF 0.6, Q4, Engineering, Electrical & Electronic) · Pub Date: 2023-10-10 · DOI: 10.2174/0123520965262459231002051022
Xiaoguang Su
{"title":"SAR Target Recognition Method Based on Adaptive Weighted Decision Fusion of Deep Features","authors":"Xiaoguang Su","doi":"10.2174/0123520965262459231002051022","DOIUrl":null,"url":null,"abstract":"background: This paper proposes a synthetic aperture radar (SAR) target recognition method based on adaptive weighted decision fusion of multi-level deep features. methods: The trained ResNet-18 is employed to extract multi-level deep features from SAR images. Afterwards, based on the joint sparse representation (JSR) model, the multi-level deep features are represented to obtain the corresponding reconstruction error vectors. Considering the differences in the abilities of different levels of features to distinguish the target, the reconstruction error vectors are analyzed based on entropy theory, and their corresponding weights are adaptively obtained. Finally, the fused reconstruction error result is obtained through adaptively weighted fusion, and the target label is determined accordingly. results: Experiments are conducted on the Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset under different conditions, and the proposed method is compared with published methods, including multi-feature decision fusion, JSR-based decision fusion and other types of ResNets. conclusion: The experimental results under standard operating condition (SOC) and extended operating conditions (EOCs) including depression angle variance and noise corruption validate the advantages of the proposed method.","PeriodicalId":43275,"journal":{"name":"Recent Advances in Electrical & Electronic Engineering","volume":"58 2 1","pages":"0"},"PeriodicalIF":0.6000,"publicationDate":"2023-10-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Recent Advances in Electrical & Electronic Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2174/0123520965262459231002051022","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Background: This paper proposes a synthetic aperture radar (SAR) target recognition method based on adaptive weighted decision fusion of multi-level deep features.

Methods: A trained ResNet-18 is employed to extract multi-level deep features from SAR images. The multi-level deep features are then represented with the joint sparse representation (JSR) model to obtain the corresponding reconstruction error vectors. Considering that different levels of features differ in their ability to discriminate the target, the reconstruction error vectors are analyzed with entropy theory, and their weights are obtained adaptively. Finally, the fused reconstruction error is obtained through adaptive weighted fusion, and the target label is determined accordingly.

Results: Experiments are conducted on the Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset under different conditions, and the proposed method is compared with published methods, including multi-feature decision fusion, JSR-based decision fusion, and other ResNet variants.

Conclusion: The experimental results under the standard operating condition (SOC) and extended operating conditions (EOCs), including depression angle variation and noise corruption, validate the advantages of the proposed method.
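The fusion step lends itself to a compact illustration. Below is a minimal NumPy sketch of the entropy-based adaptive weighting and decision stage, assuming the per-level reconstruction error vectors (one error per candidate class, one vector per feature level) have already been produced by the JSR stage. The conversion of errors to a probability-like distribution, the entropy-to-weight mapping, and the helper names `entropy_weight` and `fuse_errors` are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of entropy-based adaptive weighted decision fusion of
# per-level reconstruction error vectors. The weighting formula and helper
# names are assumptions for illustration, not the paper's exact method.
import numpy as np

def entropy_weight(error_vec: np.ndarray) -> float:
    """Map a reconstruction error vector (one value per class) to a scalar weight.

    Assumption: errors are converted to a probability-like distribution
    (smaller error -> larger mass); a peaked distribution (low entropy) is
    treated as more discriminative and receives a larger weight.
    """
    scores = np.exp(-error_vec)              # smaller error -> larger score
    p = scores / scores.sum()                # normalize to a distribution
    h = -np.sum(p * np.log(p + 1e-12))       # Shannon entropy
    h_max = np.log(len(error_vec))           # entropy of the uniform distribution
    return float(1.0 - h / h_max)            # low entropy -> weight close to 1

def fuse_errors(error_vectors: list[np.ndarray]) -> tuple[int, np.ndarray]:
    """Adaptively weight and fuse per-level error vectors; return (label, fused)."""
    weights = np.array([entropy_weight(e) for e in error_vectors])
    weights = weights / (weights.sum() + 1e-12)           # normalize weights
    fused = sum(w * e for w, e in zip(weights, error_vectors))
    return int(np.argmin(fused)), fused                   # minimum fused error -> label

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_classes = 10
    # Three hypothetical feature levels -> three reconstruction error vectors.
    errs = [rng.random(n_classes) for _ in range(3)]
    errs[0][3] = 0.05                                      # level 0 strongly favors class 3
    label, fused = fuse_errors(errs)
    print("predicted class:", label)
    print("fused errors:", np.round(fused, 3))
```

In this sketch, a level whose error vector is sharply peaked on one class (low entropy) is treated as more discriminative and contributes more to the fused result, which matches the qualitative description in the abstract; the class with the smallest fused reconstruction error is taken as the target label.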
Source Journal
Recent Advances in Electrical & Electronic Engineering (Engineering, Electrical & Electronic)
CiteScore: 1.70
Self-citation rate: 16.70%
Articles published: 101
Journal description: Recent Advances in Electrical & Electronic Engineering publishes full-length/mini reviews, research articles, and guest-edited thematic issues on electrical and electronic engineering and applications. The journal also covers research in fast-emerging applications of electrical power supply, electrical systems, power transmission, electromagnetism, motor control processes, and related technologies. The journal is essential reading for all researchers in electrical and electronic engineering science.
Latest articles in this journal:
- Solar and Wind-based Renewable DGs and DSTATCOM Allotment in Distribution System with Consideration of Various Load Models Using Spotted Hyena Optimizer Algorithm
- Soft Switching Technique in a Modified SEPIC Converter with MPPT using Cuckoo Search Algorithm
- An Adaptive Framework for Traffic Congestion Prediction Using Deep Learning
- Augmented Reality Control Based Energy Management System for Residence
- Mitigation of the Impact of Incorporating Charging Stations for Electric Vehicles Using Solar-based Renewable DG on the Electrical Distribution System