Online Nonconvex Robust Tensor Principal Component Analysis

IEEE Transactions on Neural Networks and Learning Systems · IF 8.9 · JCR Q1, Computer Science, Artificial Intelligence (CAS Region 1, Computer Science) · Published: 2024-12-25 · DOI: 10.1109/TNNLS.2024.3519213
Lanlan Feng;Yipeng Liu;Ziming Liu;Ce Zhu
{"title":"Online Nonconvex Robust Tensor Principal Component Analysis","authors":"Lanlan Feng;Yipeng Liu;Ziming Liu;Ce Zhu","doi":"10.1109/TNNLS.2024.3519213","DOIUrl":null,"url":null,"abstract":"Robust tensor principal component analysis (RTPCA) based on tensor singular value decomposition (t-SVD) separates the low-rank component and the sparse component from the multiway data. For streaming data, online RTPCA (ORTPCA) processes tensor data sequentially, where the low-rank component is updated based on the latest estimation and the newly arrived sample. It enhances both computation and storage efficiency. However, in most of the existing ORTPCA methods, the relaxation from tensor multirank to the convex tensor nuclear norm (TNN) may have a certain modeling error, which leads to unavoidable tracking accuracy loss. In this article, a tensor Schatten-p norm (<inline-formula> <tex-math>$0\\lt p\\lt 1$ </tex-math></inline-formula>) is applied to provide a tighter approximation of the tensor rank. A Lemma is deduced to divide the Schatten-p norm into terms to be updated in an online way. Based on it, the corresponding online nonconvex RTPCA (ONRTPCA) method is proposed for efficient tensor subspace tracking. Moreover, we incorporate the dynamic forgetting window into ONRTPCA to adaptively track varying subspaces. In addition, this article also provides convergence analysis and complexity analysis. 
Experimental results on synthetic data and real-world video data show that our proposed method achieves superior subspace tracking accuracy in comparison with a series of state-of-the-art methods while maintaining a high convergence speed and low memory requirement.","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"36 8","pages":"14384-14398"},"PeriodicalIF":8.9000,"publicationDate":"2024-12-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10815606/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Robust tensor principal component analysis (RTPCA) based on the tensor singular value decomposition (t-SVD) separates a low-rank component and a sparse component from multiway data. For streaming data, online RTPCA (ORTPCA) processes tensor samples sequentially, updating the low-rank component from the latest estimate and the newly arrived sample, which improves both computational and storage efficiency. However, most existing ORTPCA methods relax the tensor multirank to the convex tensor nuclear norm (TNN); this relaxation introduces modeling error and leads to unavoidable loss of tracking accuracy. In this article, a tensor Schatten-p norm ($0 \lt p \lt 1$) is applied to provide a tighter approximation of the tensor rank. A lemma is derived that splits the Schatten-p norm into terms that can be updated online. Based on this lemma, the corresponding online nonconvex RTPCA (ONRTPCA) method is proposed for efficient tensor subspace tracking. Moreover, a dynamic forgetting window is incorporated into ONRTPCA to adaptively track time-varying subspaces. Convergence analysis and complexity analysis are also provided. Experimental results on synthetic data and real-world video data show that the proposed method achieves superior subspace tracking accuracy compared with a series of state-of-the-art methods while maintaining high convergence speed and a low memory requirement.
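The abstract does not spell out how the tensor Schatten-p norm is evaluated. Under the t-SVD framework it is commonly defined via the singular values of the frontal slices in the Fourier domain: transform along the third mode, take the SVD of each slice, and sum the p-th powers of the singular values (for p = 1 this reduces to the TNN). A minimal NumPy sketch under that assumed definition — the 1/n3 scaling is one common convention, not necessarily the one used in the paper:

```python
import numpy as np

def tensor_schatten_p_norm(X, p=0.5):
    """Tensor Schatten-p norm (raised to the p-th power) under the
    t-SVD framework: FFT along the third mode, then sum sigma_i^p
    over the singular values of every Fourier-domain frontal slice.
    Assumed definition; scaling conventions vary across papers."""
    n1, n2, n3 = X.shape
    Xf = np.fft.fft(X, axis=2)  # frontal slices in the Fourier domain
    total = 0.0
    for k in range(n3):
        # Singular values of the k-th (complex) frontal slice.
        s = np.linalg.svd(Xf[:, :, k], compute_uv=False)
        total += np.sum(s ** p)
    return total / n3  # 1/n3 normalization, common in t-SVD works
```

With 0 < p < 1 this penalty grows more slowly than the nuclear norm for large singular values, which is why it approximates the tensor rank more tightly than the convex TNN relaxation.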
Source journal: IEEE Transactions on Neural Networks and Learning Systems

Categories: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture

CiteScore: 23.80 · Self-citation rate: 9.60% · Articles per year: 2102 · Review time: 3-8 weeks
About the journal: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.