{"title":"MPCA 和 MDA 通过爱因斯坦产品","authors":"Aoulaia Andahmou","doi":"10.1007/s40314-024-02866-5","DOIUrl":null,"url":null,"abstract":"<p>This work deals with the problem of multilinear principal component analysis (MPCA) and multilinear discriminant analysis (MDA), that solve for a tensor to tensor projection (TTP) using Einstein product. MPCA and MDA are considered as a higher-order extension of principal component analysis (PCA ) and linear discriminant analysis (LDA), respectively. MPCA seeks to find a low-dimensional representation that captures most of the variation present in the original data tensor. Whereas MDA seeks to find discriminative features that maximize the separation between classes, while preserving the multilinear structure. Specifically, we are interested in finding a projective tensor that maps the original data tensor onto a new lower-dimensional subspace. In this paper, we propose to solve the MPCA problem by employing the global Lanczos procedure via Einstein product for a fourth-order tensor, while solving the MDA problem by combining Newton method and global tensorial Lanczos method. The numerical experiments illustrate the use of these algorithms for face recognition problems, compression and classification.\n</p>","PeriodicalId":51278,"journal":{"name":"Computational and Applied Mathematics","volume":"57 1","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"MPCA and MDA via Einstein product\",\"authors\":\"Aoulaia Andahmou\",\"doi\":\"10.1007/s40314-024-02866-5\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>This work deals with the problem of multilinear principal component analysis (MPCA) and multilinear discriminant analysis (MDA), that solve for a tensor to tensor projection (TTP) using Einstein product. MPCA and MDA are considered as a higher-order extension of principal component analysis (PCA ) and linear discriminant analysis (LDA), respectively. MPCA seeks to find a low-dimensional representation that captures most of the variation present in the original data tensor. Whereas MDA seeks to find discriminative features that maximize the separation between classes, while preserving the multilinear structure. Specifically, we are interested in finding a projective tensor that maps the original data tensor onto a new lower-dimensional subspace. In this paper, we propose to solve the MPCA problem by employing the global Lanczos procedure via Einstein product for a fourth-order tensor, while solving the MDA problem by combining Newton method and global tensorial Lanczos method. 
This work deals with multilinear principal component analysis (MPCA) and multilinear discriminant analysis (MDA), which solve for a tensor-to-tensor projection (TTP) using the Einstein product. MPCA and MDA can be viewed as higher-order extensions of principal component analysis (PCA) and linear discriminant analysis (LDA), respectively. MPCA seeks a low-dimensional representation that captures most of the variation present in the original data tensor, whereas MDA seeks discriminative features that maximize the separation between classes while preserving the multilinear structure. Specifically, we are interested in finding a projection tensor that maps the original data tensor onto a new lower-dimensional subspace. In this paper, we propose to solve the MPCA problem by employing the global Lanczos procedure via the Einstein product for fourth-order tensors, and to solve the MDA problem by combining Newton's method with the global tensorial Lanczos method. Numerical experiments illustrate the use of these algorithms for face recognition, compression, and classification.
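To make the projection idea concrete, the following is a minimal sketch (not the paper's implementation) of the mode-2 Einstein product of fourth-order tensors and a tensor-to-tensor projection built on it, written with NumPy. The dimension names (I1, I2, P1, P2), the random projector U, and the helper names einstein_product and ttp are illustrative assumptions, not constructs from the paper.

```python
# Sketch: mode-2 Einstein product of fourth-order tensors and a TTP.
# All names and sizes below are assumptions for illustration only.
import numpy as np

def einstein_product(A, B):
    """Mode-2 Einstein product: contract the last two modes of A with the
    first two modes of B.
    A: (I1, I2, J1, J2), B: (J1, J2, K1, K2) -> result: (I1, I2, K1, K2)."""
    return np.einsum('abij,ijcd->abcd', A, B)

def ttp(U, X):
    """Tensor-to-tensor projection Y = U^T *_2 X, where the transpose of a
    fourth-order tensor swaps its first and second pairs of indices."""
    Ut = np.transpose(U, (2, 3, 0, 1))   # (P1, P2, I1, I2)
    return einstein_product(Ut, X)       # (P1, P2, n1, n2)

rng = np.random.default_rng(0)
I1, I2, P1, P2, n1, n2 = 6, 5, 3, 2, 4, 4
X = rng.standard_normal((I1, I2, n1, n2))  # data tensor
U = rng.standard_normal((I1, I2, P1, P2))  # projection tensor (P1 <= I1, P2 <= I2)
Y = ttp(U, X)
print(Y.shape)  # (3, 2, 4, 4): a lower-dimensional representation of X
```

In MPCA-style methods the projector U would be computed from the data (e.g. via a Lanczos-type procedure) rather than drawn at random as in this toy example; the sketch only illustrates how the Einstein product realizes the dimensionality reduction.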
Journal Introduction:
Computational & Applied Mathematics began to be published in 1981. This journal was conceived as the main scientific publication of SBMAC (Brazilian Society of Computational and Applied Mathematics).
The objective of the journal is the publication of original research in Applied and Computational Mathematics, with interfaces in Physics, Engineering, Chemistry, Biology, Operations Research, Statistics, Social Sciences and Economics. The journal upholds the usual quality standards of international scientific journals and aims for a high level of contribution in terms of originality, depth and relevance.