High-Order CPD Estimation with Dimensionality Reduction Using a Tensor Train Model

Yassine Zniyed, R. Boyer, A. Almeida, G. Favier
{"title":"High-Order CPD Estimation with Dimensionality Reduction Using a Tensor Train Model","authors":"Yassine Zniyed, R. Boyer, A. Almeida, G. Favier","doi":"10.23919/EUSIPCO.2018.8553466","DOIUrl":null,"url":null,"abstract":"The canonical polyadic decomposition (CPD) is one of the most popular tensor-based analysis tools due to its usefulness in numerous fields of application. The Q-order CPD is parametrized by $Q$ matrices also called factors which have to be recovered. The factors estimation is usually carried out by means of the alternating least squares (ALS) algorithm. In the context of multi-modal big data analysis, i.e., large order $(Q)$ and dimensions, the ALS algorithm has two main drawbacks. Firstly, its convergence is generally slow and may fail, in particular for large values of $Q$, and secondly it is highly time consuming. In this paper, it is proved that a Q-order CPD of rank-R is equivalent to a train of $Q$ 3-order CPD(s) of rank-R. In other words, each tensor train (TT)-core admits a 3-order CPD of rank-R. Based on the structure of the TT-cores, a new dimensionality reduction and factor retrieval scheme is derived. The proposed method has a better robustness to noise with a smaller computational cost than the ALS algorithm.","PeriodicalId":303069,"journal":{"name":"2018 26th European Signal Processing Conference (EUSIPCO)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 26th European Signal Processing Conference (EUSIPCO)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.23919/EUSIPCO.2018.8553466","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 13

Abstract

The canonical polyadic decomposition (CPD) is one of the most popular tensor-based analysis tools due to its usefulness in numerous fields of application. The Q-order CPD is parametrized by $Q$ matrices, also called factors, which have to be recovered. Factor estimation is usually carried out by means of the alternating least squares (ALS) algorithm. In the context of multi-modal big data analysis, i.e., large order $Q$ and large dimensions, the ALS algorithm has two main drawbacks: its convergence is generally slow and may fail, in particular for large values of $Q$, and it is highly time consuming. In this paper, it is proved that a Q-order CPD of rank $R$ is equivalent to a train of $Q$ 3-order CPDs of rank $R$. In other words, each tensor train (TT) core admits a 3-order CPD of rank $R$. Based on the structure of the TT-cores, a new dimensionality reduction and factor retrieval scheme is derived. The proposed method is more robust to noise and has a lower computational cost than the ALS algorithm.
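To make the dimensionality-reduction idea concrete, the following NumPy sketch first applies a plain TT-SVD to split a synthetic Q-order rank-R tensor into Q third-order TT-cores, and then fits a small rank-R 3-order CPD to each core with an unregularized ALS. This is only an illustration under assumptions, not the authors' algorithm: the helper names khatri_rao, tt_svd and als_3order are hypothetical, and the sketch ignores the coupling between neighboring TT-cores that the paper's factor-retrieval scheme exploits.

# Illustrative sketch only: TT-SVD reduction followed by one small rank-R ALS
# per third-order TT-core. Helper names are assumptions, not the paper's code.
import numpy as np


def khatri_rao(A, B):
    """Column-wise Khatri-Rao product; row index is (row of A) * B.shape[0] + (row of B)."""
    return np.einsum('ir,jr->ijr', A, B).reshape(A.shape[0] * B.shape[0], -1)


def tt_svd(tensor, rank):
    """Decompose a Q-order tensor into Q TT-cores with TT-ranks capped at `rank`."""
    dims = tensor.shape
    cores, r_prev = [], 1
    unfolding = tensor.reshape(dims[0], -1)
    for q in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        r = min(rank, U.shape[1])
        cores.append(U[:, :r].reshape(r_prev, dims[q], r))        # core G_q of shape (r_{q-1}, N_q, r_q)
        unfolding = (np.diag(s[:r]) @ Vt[:r]).reshape(r * dims[q + 1], -1)
        r_prev = r
    cores.append(unfolding.reshape(r_prev, dims[-1], 1))          # last core
    return cores


def als_3order(core, R, n_iter=100, seed=0):
    """Unregularized rank-R ALS on a single third-order TT-core (illustrative)."""
    d1, d2, d3 = core.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((d, R)) for d in (d1, d2, d3))
    T1 = core.reshape(d1, -1)                       # mode-1 unfolding
    T2 = np.moveaxis(core, 1, 0).reshape(d2, -1)    # mode-2 unfolding
    T3 = np.moveaxis(core, 2, 0).reshape(d3, -1)    # mode-3 unfolding
    for _ in range(n_iter):
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C


if __name__ == "__main__":
    # Synthetic rank-3, 4-order CPD tensor of size 8x8x8x8: the single large
    # 4-order ALS problem is replaced by four small third-order ones.
    Q, R, N = 4, 3, 8
    rng = np.random.default_rng(1)
    factors = [rng.standard_normal((N, R)) for _ in range(Q)]
    tensor = np.einsum('ar,br,cr,dr->abcd', *factors)
    cores = tt_svd(tensor, rank=R)
    print("TT-core shapes:", [g.shape for g in cores])
    per_core_factors = [als_3order(g, R) for g in cores]   # Q small 3-order CPDs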