Robust tensor recovery via a nonconvex approach with ket augmentation and auto-weighted strategy

IF 1.8 | CAS Tier 3 (Mathematics) | JCR Q1 (MATHEMATICS) | Numerical Linear Algebra with Applications | Pub Date: 2024-07-30 | DOI: 10.1002/nla.2580
Wenhui Xie, Chen Ling, Hongjin He, Lei‐Hong Zhang
{"title":"通过 ket 增强和自动加权策略的非凸方法实现稳健的张量恢复","authors":"Wenhui Xie, Chen Ling, Hongjin He, Lei‐Hong Zhang","doi":"10.1002/nla.2580","DOIUrl":null,"url":null,"abstract":"In this article, we introduce a nonconvex tensor recovery approach, which employs the powerful ket augmentation technique to expand a low order tensor into a high‐order one so that we can exploit the advantage of tensor train (TT) decomposition tailored for high‐order tensors. Moreover, we define a new nonconvex surrogate function to approximate the tensor rank, and develop an auto‐weighted mechanism to adjust the weights of the resulting high‐order tensor's TT ranks. To make our approach robust, we add two mode‐unfolding regularization terms to enhance the model for the purpose of exploring spatio‐temporal continuity and self‐similarity of the underlying tensors. Also, we propose an implementable algorithm to solve the proposed optimization model in the sense that each subproblem enjoys a closed‐form solution. A series of numerical results demonstrate that our approach works well on recovering color images and videos.","PeriodicalId":49731,"journal":{"name":"Numerical Linear Algebra with Applications","volume":"25 1","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2024-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Robust tensor recovery via a nonconvex approach with ket augmentation and auto‐weighted strategy\",\"authors\":\"Wenhui Xie, Chen Ling, Hongjin He, Lei‐Hong Zhang\",\"doi\":\"10.1002/nla.2580\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this article, we introduce a nonconvex tensor recovery approach, which employs the powerful ket augmentation technique to expand a low order tensor into a high‐order one so that we can exploit the advantage of tensor train (TT) decomposition tailored for high‐order tensors. Moreover, we define a new nonconvex surrogate function to approximate the tensor rank, and develop an auto‐weighted mechanism to adjust the weights of the resulting high‐order tensor's TT ranks. To make our approach robust, we add two mode‐unfolding regularization terms to enhance the model for the purpose of exploring spatio‐temporal continuity and self‐similarity of the underlying tensors. Also, we propose an implementable algorithm to solve the proposed optimization model in the sense that each subproblem enjoys a closed‐form solution. 
A series of numerical results demonstrate that our approach works well on recovering color images and videos.\",\"PeriodicalId\":49731,\"journal\":{\"name\":\"Numerical Linear Algebra with Applications\",\"volume\":\"25 1\",\"pages\":\"\"},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2024-07-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Numerical Linear Algebra with Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1002/nla.2580\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Numerical Linear Algebra with Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1002/nla.2580","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0

Abstract

In this article, we introduce a nonconvex tensor recovery approach, which employs the powerful ket augmentation technique to expand a low-order tensor into a high-order one so that we can exploit the advantage of tensor train (TT) decomposition tailored for high-order tensors. Moreover, we define a new nonconvex surrogate function to approximate the tensor rank, and develop an auto-weighted mechanism to adjust the weights of the resulting high-order tensor's TT ranks. To make our approach robust, we add two mode-unfolding regularization terms to enhance the model for the purpose of exploring the spatio-temporal continuity and self-similarity of the underlying tensors. Also, we propose an implementable algorithm to solve the proposed optimization model in the sense that each subproblem enjoys a closed-form solution. A series of numerical results demonstrate that our approach works well on recovering color images and videos.
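The abstract names two building blocks: ket augmentation, which reshapes a low-order tensor (for example, an RGB image) into a higher-order one, and a tensor train (TT) decomposition of the augmented tensor. The sketch below illustrates both steps in NumPy. It is not the authors' implementation: the function names, the 2x2 spatial block hierarchy, the fixed truncation rank (a stand-in for the paper's auto-weighted TT ranks and nonconvex rank surrogate), and the 256x256x3 test size are all illustrative assumptions.

```python
import numpy as np

def ket_augmentation(image, block=2, levels=8):
    """Reshape an H x W x C image with H = W = block**levels into an order
    (levels + 1) tensor whose k-th mode (size block**2) indexes the position
    inside the block x block patch at the k-th level of the spatial hierarchy."""
    h, w, c = image.shape
    assert h == w == block ** levels, "spatial size must equal block**levels"
    # Factor each spatial axis into `levels` axes of size `block`
    # (row factors first, then column factors).
    t = image.reshape([block] * levels + [block] * levels + [c])
    # Interleave the row-level and column-level axes so that each consecutive
    # (row factor, column factor) pair addresses one level of the hierarchy.
    perm = [i for pair in zip(range(levels), range(levels, 2 * levels)) for i in pair]
    t = np.transpose(t, perm + [2 * levels])
    # Merge each (row factor, column factor) pair into one mode of size block**2.
    return t.reshape([block ** 2] * levels + [c])

def tt_svd(tensor, max_rank=8):
    """Plain TT-SVD: sweep left to right, truncating each unfolding's SVD at
    `max_rank` (a fixed stand-in for the paper's auto-weighted TT ranks)."""
    dims = tensor.shape
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(len(dims) - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = min(max_rank, len(s))
        cores.append(u[:, :r_new].reshape(rank, dims[k], r_new))
        mat = (np.diag(s[:r_new]) @ vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

if __name__ == "__main__":
    # Toy usage on a random 256 x 256 x 3 "image": the augmented tensor has
    # eight modes of size 4 plus one color mode of size 3.
    img = np.random.rand(256, 256, 3)
    high_order = ket_augmentation(img)
    print(high_order.shape)                 # (4, 4, 4, 4, 4, 4, 4, 4, 3)
    cores = tt_svd(high_order, max_rank=8)
    print([core.shape for core in cores])   # TT cores of shape (r_{k-1}, n_k, r_k)
```

In the paper itself, the hard truncation step would be replaced by minimizing the proposed nonconvex rank surrogate with auto-weighted TT ranks, together with the two mode-unfolding regularizers; the sketch only shows the tensor reshaping and a plain TT-SVD baseline.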
Source Journal
CiteScore: 3.40
Self-citation rate: 2.30%
Annual articles: 50
Review time: 12 months
Journal Description: Manuscripts submitted to Numerical Linear Algebra with Applications should include large-scale broad-interest applications in which challenging computational results are integral to the approach investigated and analysed. Manuscripts that, in the Editor's view, do not satisfy these conditions will not be accepted for review. Numerical Linear Algebra with Applications receives submissions in areas that address developing, analysing and applying linear algebra algorithms for solving problems arising in multilinear (tensor) algebra, in statistics, such as Markov Chains, as well as in deterministic and stochastic modelling of large-scale networks, algorithm development, performance analysis or related computational aspects. Topics covered include: Standard and Generalized Conjugate Gradients, Multigrid and Other Iterative Methods; Preconditioning Methods; Direct Solution Methods; Numerical Methods for Eigenproblems; Newton-like Methods for Nonlinear Equations; Parallel and Vectorizable Algorithms in Numerical Linear Algebra; Application of Methods of Numerical Linear Algebra in Science, Engineering and Economics.
Latest Articles in This Journal
A Family of Inertial Three-Term CGPMs for Large-Scale Nonlinear Pseudo-Monotone Equations With Convex Constraints
Signal and image reconstruction with a double parameter Hager–Zhang-type conjugate gradient method for system of nonlinear equations
Superlinear Krylov convergence under streamline diffusion FEM for convection-dominated elliptic operators
On rank-revealing QR factorizations of quaternion matrices
Probabilistic perturbation bounds of matrix decompositions