{"title":"How are the Centered Kernel Principal Components Relevant to Regression Task? -An Exact Analysis","authors":"M. Yukawa, K. Müller, Yuto Ogino","doi":"10.1109/ICASSP.2018.8462392","DOIUrl":null,"url":null,"abstract":"We present an exact analytic expression of the contributions of the kernel principal components to the relevant information in a nonlinear regression problem. A related study has been presented by Braun, Buhmann, and Müller in 2008, where an upper bound of the contributions was given for a general supervised learning problem but with “uncentered” kernel PCAs. Our analysis clarifies that the relevant information of a kernel regression under explicit centering operation is contained in a finite number of leading kernel principal components, as in the “uncentered” kernel-Pca case, if the kernel matches the underlying nonlinear function so that the eigenvalues of the centered kernel matrix decay quickly. We compare the regression performances of the least-square-based methods with the centered and uncentered kernel PCAs by simulations.","PeriodicalId":6638,"journal":{"name":"2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","volume":"1 1","pages":"2841-2845"},"PeriodicalIF":0.0000,"publicationDate":"2018-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICASSP.2018.8462392","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
We present an exact analytic expression for the contributions of the kernel principal components to the relevant information in a nonlinear regression problem. A related study was presented by Braun, Buhmann, and Müller in 2008, where an upper bound on the contributions was given for a general supervised learning problem, but with "uncentered" kernel PCA. Our analysis clarifies that the relevant information of a kernel regression under an explicit centering operation is contained in a finite number of leading kernel principal components, as in the "uncentered" kernel PCA case, provided that the kernel matches the underlying nonlinear function so that the eigenvalues of the centered kernel matrix decay quickly. We compare, via simulations, the regression performance of least-squares-based methods using the centered and uncentered kernel PCAs.
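To make the setting concrete, below is a minimal Python sketch (not the paper's exact analysis) of the quantity the abstract discusses: the centered kernel matrix is eigendecomposed, and the centered regression targets are projected onto each kernel principal component to measure how much of the target's energy each component carries. The helper names `rbf_kernel` and `centered_kpca_contributions`, the Gaussian kernel choice, and the toy data are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sample sets X and Y.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def centered_kpca_contributions(X, y, gamma=1.0):
    """Project the centered targets y onto the eigenvectors of the
    explicitly centered kernel matrix and return each component's
    squared contribution (an illustrative proxy for 'relevant information')."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    # Explicit centering: Kc = H K H with H = I - (1/n) 1 1^T.
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    # Eigendecomposition, sorted so eigenvalues decrease.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    yc = y - y.mean()  # center targets to match the centered feature space
    contrib = (eigvecs.T @ yc) ** 2
    return eigvals, contrib

# Toy example: a smooth nonlinear target that the kernel "matches",
# so the centered kernel eigenvalues decay quickly.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)

eigvals, contrib = centered_kpca_contributions(X, y, gamma=5.0)
frac = np.cumsum(contrib) / contrib.sum()
print("components carrying 99% of target energy:", np.searchsorted(frac, 0.99) + 1)
```

Under these assumptions, the printed count stays small, which mirrors the abstract's claim that the relevant information concentrates in a few leading centered kernel principal components; replacing `Kc` with the uncentered `K` gives the comparison case the simulations address.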