Title: Statistical performance of quantile tensor regression with convex regularization
Authors: Wenqi Lu, Zhongyi Zhu, Rui Li, Heng Lian
DOI: 10.1016/j.jmva.2023.105249
Journal: Journal of Multivariate Analysis, Volume 200, Article 105249
Publication date: 2023-11-14
URL: https://www.sciencedirect.com/science/article/pii/S0047259X23000957
Citations: 0
Abstract
In this paper, we consider high-dimensional quantile tensor regression using a general convex decomposable regularizer and analyze the statistical performance of the estimator. The rates are stated in terms of the intrinsic dimension of the estimation problem, which is, roughly speaking, the dimension of the smallest subspace that contains the true coefficient. Previously, convex regularized tensor regression had been studied with a least squares loss, Gaussian tensorial predictors, and Gaussian errors, with rates that depend on the Gaussian width of a convex set. Our results extend this work to the nonsmooth quantile loss. To deal with the non-Gaussian setting, we use the concept of Rademacher complexity with appropriate concentration inequalities in place of the Gaussian width. For the multilinear nuclear norm penalty, our Orlicz norm bound on the operator norm of a random matrix may be of independent interest. We validate the theoretical guarantees in numerical experiments. We also demonstrate the advantage of quantile regression over mean regression and, in simulation studies, compare the performance of the convex regularization method and the nonconvex decomposition method for solving the quantile tensor regression problem.
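To make the objective concrete, the sketch below implements the nonsmooth quantile (check) loss and a multilinear nuclear norm penalty (the sum of nuclear norms of the mode unfoldings), minimized by plain subgradient descent. This is a minimal didactic illustration of the penalized objective, not the estimator or algorithm analyzed in the paper; the function names, step size, and solver choice are all assumptions for illustration.

```python
import numpy as np

def pinball_loss(u, tau):
    # Quantile (check) loss: rho_tau(u) = u * (tau - 1{u < 0}).
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

def unfold(T, mode):
    # Mode-k matricization: move axis k to the front and flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def multilinear_nuclear_norm(T):
    # Sum of nuclear norms of all mode unfoldings.
    return sum(np.linalg.norm(unfold(T, k), "nuc") for k in range(T.ndim))

def fit_quantile_tensor(X, y, shape, tau=0.5, lam=0.1, lr=1e-2, n_iter=200):
    # Subgradient descent on
    #   (1/n) sum_i rho_tau(y_i - <X_i, B>) + lam * ||B||_multilinear-nuc.
    # A didactic sketch; convergence tuning is deliberately omitted.
    B = np.zeros(shape)
    n = len(y)
    for _ in range(n_iter):
        resid = y - np.array([np.sum(Xi * B) for Xi in X])
        # d/dB rho_tau(y_i - <X_i, B>) = -tau * X_i if resid >= 0, else (1 - tau) * X_i.
        w = np.where(resid >= 0, -tau, 1.0 - tau)
        grad = sum(wi * Xi for wi, Xi in zip(w, X)) / n
        # A subgradient of the nuclear norm of an unfolding is U V^T from its SVD;
        # fold it back to tensor shape and sum over modes.
        pen_grad = np.zeros(shape)
        for k in range(B.ndim):
            U, _, Vt = np.linalg.svd(unfold(B, k), full_matrices=False)
            G = (U @ Vt).reshape(np.moveaxis(B, k, 0).shape)
            pen_grad += np.moveaxis(G, 0, k)
        B -= lr * (grad + lam * pen_grad)
    return B
```

For a 2x2 identity matrix viewed as a 2-mode tensor, both unfoldings are the identity, so `multilinear_nuclear_norm` returns 4. Subgradient descent is used here only because the check loss and the nuclear norm are both nonsmooth; proximal or ADMM-type solvers are the more common practical choice.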
Journal introduction:
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of:
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.