Underwater moving target detection using online robust principal component analysis and multimodal anomaly detection.

IF 2.1 | Physics and Astronomy, JCR Region 2 | Q2 ACOUSTICS | Journal of the Acoustical Society of America | Pub Date: 2025-01-01 | DOI: 10.1121/10.0034831
Shaofeng Zou, Xuyang Wang, Tao Yuan, Kaihui Zeng, Guolin Li, Xiang Xie
{"title":"Underwater moving target detection using online robust principal component analysis and multimodal anomaly detection.","authors":"Shaofeng Zou, Xuyang Wang, Tao Yuan, Kaihui Zeng, Guolin Li, Xiang Xie","doi":"10.1121/10.0034831","DOIUrl":null,"url":null,"abstract":"<p><p>In shallow water, reverberation complicates the detection of low-intensity, variable-echo moving targets, such as divers. Traditional methods often fail to distinguish these targets from reverberation, and data-driven methods are constrained by the limited data on intruding targets. This paper introduces the online robust principal component analysis and multimodal anomaly detection (ORMAD) method to address these challenges. ORMAD efficiently performs online low-rank and sparse decomposition while utilizing unsupervised multimodal anomaly detection to enhance detection performance. The multimodal anomaly detection process involves two phases: modality extraction and anomaly detection. During modality extraction, echo data are separated into echo structure and spatial trajectory modalities, providing complementary information that improves the network representation of both reverberation and moving targets. The subsequent anomaly detection phase unsupervisedly learns the modalities of fluctuating reverberation, thereby achieving stable reconstruction while maintaining sensitivity to moving targets. This sensitivity allows effective identification of moving targets by detecting reconstruction loss. Experimental results demonstrate that ORMAD effectively improves detection performance in complex reverberation scenarios. In a real-world sonar dataset, ORMAD increased the average precision for detecting diver targets from 60% to 75% compared to the state-of-the-art method.</p>","PeriodicalId":17168,"journal":{"name":"Journal of the Acoustical Society of America","volume":"157 1","pages":"122-136"},"PeriodicalIF":2.1000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of the Acoustical Society of America","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.1121/10.0034831","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ACOUSTICS","Score":null,"Total":0}
Citations: 0

Abstract

In shallow water, reverberation complicates the detection of low-intensity, variable-echo moving targets, such as divers. Traditional methods often fail to distinguish these targets from reverberation, and data-driven methods are constrained by the limited data available on intruding targets. This paper introduces the online robust principal component analysis and multimodal anomaly detection (ORMAD) method to address these challenges. ORMAD efficiently performs online low-rank and sparse decomposition while using unsupervised multimodal anomaly detection to enhance detection performance. The multimodal anomaly detection process involves two phases: modality extraction and anomaly detection. During modality extraction, echo data are separated into echo-structure and spatial-trajectory modalities, providing complementary information that improves the network's representation of both reverberation and moving targets. The subsequent anomaly detection phase learns the modalities of fluctuating reverberation in an unsupervised manner, achieving stable reconstruction while maintaining sensitivity to moving targets. This sensitivity allows moving targets to be identified effectively from the reconstruction loss. Experimental results demonstrate that ORMAD improves detection performance in complex reverberation scenarios. On a real-world sonar dataset, ORMAD increased the average precision for detecting diver targets from 60% to 75% compared with the state-of-the-art method.
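
The two ideas summarized in the abstract (a low-rank plus sparse decomposition that separates slowly varying reverberation from sparse candidate echoes, followed by a reconstruction-loss test that flags pings a reverberation-only model cannot reconstruct) can be illustrated with a short sketch. The snippet below is not the authors' ORMAD implementation: it uses a standard batch robust PCA solver (inexact augmented Lagrangian) rather than the online variant described in the paper, and a plain PCA reconstruction-error score in place of the unsupervised multimodal network. The function names, parameter defaults, and synthetic data are illustrative assumptions only.

```python
# Minimal sketch (not the authors' ORMAD code): batch robust PCA via the
# inexact augmented Lagrangian method, followed by a simple per-ping
# reconstruction-loss score. All names and thresholds are illustrative.
import numpy as np

def robust_pca(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Decompose M (pings x range/bearing cells) into low-rank L (reverberation)
    plus sparse S (candidate echoes): min ||L||_* + lam*||S||_1 s.t. M = L + S."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))          # common default weight
    if mu is None:
        mu = 0.25 * m * n / (np.abs(M).sum() + 1e-12)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)
    norm_M = np.linalg.norm(M, 'fro') + 1e-12
    for _ in range(max_iter):
        # Singular value thresholding step for the low-rank component
        U, sig, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        sig = np.maximum(sig - 1.0 / mu, 0.0)
        L = (U * sig) @ Vt
        # Soft-thresholding (shrinkage) step for the sparse component
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual update and convergence check
        residual = M - L - S
        Y = Y + mu * residual
        if np.linalg.norm(residual, 'fro') / norm_M < tol:
            break
    return L, S

def reconstruction_loss_scores(S, train_frac=0.5, n_components=5):
    """Score each ping by how poorly a PCA model of assumed target-free pings
    reconstructs it; a large loss suggests a moving target."""
    n_train = int(S.shape[0] * train_frac)
    train = S[:n_train]                          # assumed reverberation-only pings
    mean = train.mean(axis=0)
    _, _, Vt = np.linalg.svd(train - mean, full_matrices=False)
    basis = Vt[:n_components]                    # dominant reverberation patterns
    centered = S - mean
    recon = centered @ basis.T @ basis + mean
    return np.linalg.norm(S - recon, axis=1)     # per-ping reconstruction loss

# Usage on synthetic data: a rank-1 background plus a target-like sparse blob.
rng = np.random.default_rng(0)
background = np.outer(rng.standard_normal(200), rng.standard_normal(80))
data = background.copy()
data[150:155, 40:43] += 5.0                      # injected target-like anomaly
L, S = robust_pca(data)
scores = reconstruction_loss_scores(S)
print("most anomalous ping:", int(np.argmax(scores)))
```

In this toy setup the stationary background is absorbed by the low-rank term, the injected blob ends up in the sparse term, and its pings receive the largest reconstruction loss; the paper's contribution is doing this online and replacing the PCA scorer with a multimodal network trained on echo-structure and spatial-trajectory modalities.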

Source journal: Journal of the Acoustical Society of America
CiteScore: 4.60
Self-citation rate: 16.70%
Articles published: 1433
Review time: 4.7 months
Journal description: Since 1929 The Journal of the Acoustical Society of America has been the leading source of theoretical and experimental research results in the broad interdisciplinary study of sound. Subject coverage includes: linear and nonlinear acoustics; aeroacoustics, underwater sound and acoustical oceanography; ultrasonics and quantum acoustics; architectural and structural acoustics and vibration; speech, music and noise; psychology and physiology of hearing; engineering acoustics, transduction; bioacoustics, animal bioacoustics.
Latest articles in this journal:
All we know about anechoic chambers.
Temporal patterns in Malaysian rainforest soundscapes demonstrated using acoustic indices and deep embeddings trained on time-of-day estimation.
Validation of a three-dimensional model for improving the design of multiple-backscattering ultrasonic sensors.
A combined noise source model based on vertical coherence to quantify the proportions of two types of noise power.
A small cavity for detecting sound-induced flow.