Directionally Paired Principal Component Analysis for Bivariate Estimation Problems
Yifei Fan, Navdeep Dahiya, Samuel Bignardi, Romeil Sandhu, Anthony Yezzi
Proceedings of the IAPR International Conference on Pattern Recognition (ICPR), 2021 (published online 2021-05-05). DOI: 10.1109/icpr48806.2021.9412245
Abstract
We propose Directionally Paired Principal Component Analysis (DP-PCA), a novel linear dimension-reduction model for estimating coupled yet partially observable variable sets. Unlike partial least squares methods (e.g., partial least squares regression and canonical correlation analysis) that maximize correlation/covariance between the two datasets, our DP-PCA directly minimizes, either conditionally or unconditionally, the reconstruction and prediction errors for the observable and unobservable parts, respectively. We demonstrate the optimality of the proposed DP-PCA approach, and we compare it against relevant linear cross-decomposition methods through data reconstruction and prediction experiments on synthetic Gaussian data, multi-target regression datasets, and a single-channel image dataset. Results show that when only a single pair of bases is allowed, the conditional DP-PCA achieves the lowest reconstruction error on the observable part and on the total variable set as a whole, while the unconditional DP-PCA reaches the lowest prediction error on the unobservable part. When an extra budget is allowed for the observable part's PCA basis, one can reach an optimal solution with a combined method: standard PCA for the observable part and unconditional DP-PCA for the unobservable part.
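The sketch below illustrates the kind of evaluation protocol the abstract describes, not the authors' implementation: on synthetic Gaussian data it compares the linear cross-decomposition baselines available in scikit-learn (PLS regression and CCA) against a simple stand-in for the combined approach mentioned above, namely standard PCA on the observable part plus a least-squares map from the PCA scores to the unobservable part. The function name `paired_pca_predict` and all dimensions are illustrative assumptions; DP-PCA itself is defined in the paper, not here.

```python
# Minimal sketch of a reconstruction/prediction comparison on synthetic Gaussian data.
# The "PCA+LS" baseline is a hypothetical stand-in for the PCA-plus-prediction setup,
# NOT the DP-PCA method from the paper.
import numpy as np
from sklearn.cross_decomposition import PLSRegression, CCA
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n, dx, dy, k = 500, 6, 3, 1                    # samples, observable dim, unobservable dim, basis pairs
Z = rng.normal(size=(n, 2))                    # shared latent factors
X = Z @ rng.normal(size=(2, dx)) + 0.1 * rng.normal(size=(n, dx))  # observable part
Y = Z @ rng.normal(size=(2, dy)) + 0.1 * rng.normal(size=(n, dy))  # unobservable part

def paired_pca_predict(X, Y, k):
    """PCA reconstruction of X with k components, plus a least-squares map from
    the k-dimensional scores to Y (illustrative stand-in, not DP-PCA)."""
    pca = PCA(n_components=k).fit(X)
    scores = pca.transform(X)
    X_rec = pca.inverse_transform(scores)
    Y_hat = LinearRegression().fit(scores, Y).predict(scores)
    return X_rec, Y_hat

def report(name, X_rec, Y_hat):
    print(f"{name:>8}: reconstruction MSE={np.mean((X - X_rec) ** 2):.4f}  "
          f"prediction MSE={np.mean((Y - Y_hat) ** 2):.4f}")

# PCA on the observable part + linear prediction of the unobservable part.
report("PCA+LS", *paired_pca_predict(X, Y, k))

# Partial least squares regression with a single pair of components.
pls = PLSRegression(n_components=k).fit(X, Y)
report("PLS", pls.inverse_transform(pls.transform(X)), pls.predict(X))

# Canonical correlation analysis with a single pair of components.
cca = CCA(n_components=k).fit(X, Y)
x_scores, _ = cca.transform(X, Y)
report("CCA", cca.inverse_transform(x_scores), cca.predict(X))
```

Under these assumptions the script reports, for each method, the mean squared reconstruction error on the observable part X and the mean squared prediction error on the unobservable part Y, which are the two quantities the abstract uses to compare the methods.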