Multi-view side information-incorporated tensor completion
Authors: Yingjie Tian, Xiaotong Yu, Saiji Fu
Journal: Numerical Linear Algebra with Applications (JCR Q1, Mathematics; Impact Factor 1.8)
DOI: 10.1002/nla.2485 (https://doi.org/10.1002/nla.2485)
Publication date: 2022-12-19 (Journal Article)
Citations: 0
Abstract
Tensor completion arises in numerous applications where the data are high-dimensional and gathered from multiple sources or views. Existing methods incorporate only structural information, ignoring the fact that ubiquitous side information may help estimate the missing entries of a partially observed tensor. Motivated by this, we formulate a sparse and low-rank tensor completion model named SLRMV. The $\ell_0$-norm, rather than its convex relaxation, is used in the objective function to constrain the sparsity of the noise. CP decomposition is used to factorize the high-quality tensor, and a combination of Schatten $p$-norms on the latent factor matrices characterizes the low-rank tensor structure with high computational efficiency. Diverse similarity matrices for the same factor matrix are treated as multi-view side information guiding the completion task. Although SLRMV is a nonconvex and discontinuous problem, an optimality analysis in terms of the Karush-Kuhn-Tucker (KKT) conditions is provided, based on which a hard-thresholding-based alternating direction method of multipliers (HT-ADMM) is designed. Extensive experiments demonstrate the effectiveness and efficiency of SLRMV in tensor completion.
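The abstract names three building blocks: elementwise hard thresholding (the proximal operator of the $\ell_0$ penalty, which HT-ADMM applies to the noise term), reconstruction of a tensor from its CP factor matrices, and the Schatten $p$-norm of a factor matrix. The sketch below illustrates each in isolation; it is a minimal, hypothetical rendering of these standard operations, not the authors' SLRMV implementation, and the function names and the restriction to a 3-way tensor are assumptions for the example.

```python
import numpy as np

def hard_threshold(X, lam):
    """Proximal operator of lam * ||X||_0: keep entries whose magnitude
    exceeds sqrt(2*lam), zero out the rest (elementwise)."""
    out = X.copy()
    out[np.abs(out) <= np.sqrt(2.0 * lam)] = 0.0
    return out

def cp_reconstruct(A, B, C):
    """Rebuild a 3-way tensor from CP factor matrices A (I x R),
    B (J x R), C (K x R): T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)

def schatten_p_norm(M, p):
    """Schatten p-norm of a matrix: (sum_i sigma_i^p)^(1/p),
    computed from the singular values of M."""
    s = np.linalg.svd(M, compute_uv=False)
    return (s ** p).sum() ** (1.0 / p)

# Example: threshold a noisy vector, then rebuild a rank-1 tensor.
noise = np.array([0.1, -2.0, 0.5, 3.0])
sparse_noise = hard_threshold(noise, 0.5)   # threshold sqrt(2*0.5) = 1.0
T = cp_reconstruct(np.ones((2, 1)), np.ones((3, 1)), np.ones((4, 1)))
```

With `lam = 0.5` the threshold is 1.0, so only the entries with magnitude above 1 survive; the all-ones rank-1 factors reconstruct a 2x3x4 tensor of ones. In HT-ADMM-style schemes, hard thresholding is what makes the $\ell_0$ subproblem tractable despite its discontinuity, since the proximal step has this closed form.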
About the journal:
Manuscripts submitted to Numerical Linear Algebra with Applications should include large-scale broad-interest applications in which challenging computational results are integral to the approach investigated and analysed. Manuscripts that, in the Editor’s view, do not satisfy these conditions will not be accepted for review.
Numerical Linear Algebra with Applications welcomes submissions on developing, analysing, and applying linear algebra algorithms to problems arising in multilinear (tensor) algebra and in statistics (e.g., Markov chains), as well as in deterministic and stochastic modelling of large-scale networks, algorithm development, performance analysis, and related computational aspects.
Topics covered include: Standard and Generalized Conjugate Gradients, Multigrid and Other Iterative Methods; Preconditioning Methods; Direct Solution Methods; Numerical Methods for Eigenproblems; Newton-like Methods for Nonlinear Equations; Parallel and Vectorizable Algorithms in Numerical Linear Algebra; Application of Methods of Numerical Linear Algebra in Science, Engineering and Economics.