{"title":"核范数加1范数惩罚的大因子模型估计","authors":"Matteo Farnè, Angela Montanari","doi":"10.1016/j.jmva.2023.105244","DOIUrl":null,"url":null,"abstract":"<div><p>This paper provides a comprehensive estimation framework via nuclear norm plus <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>1</mn></mrow></msub></math></span> norm penalization for high-dimensional approximate factor models with a sparse residual covariance. The underlying assumptions allow for non-pervasive latent eigenvalues and a prominent residual covariance pattern. In that context, existing approaches based on principal components may lead to misestimate the latent rank. On the contrary, the proposed optimization strategy recovers with high probability both the covariance matrix components and the latent rank and the residual sparsity pattern. Conditioning on the recovered low rank and sparse matrix varieties, we derive the finite sample covariance matrix estimators with the tightest error bound in minimax sense and we prove that the ensuing estimators of factor loadings and scores via Bartlett’s and Thomson’s methods have the same property. The asymptotic rates for those estimators of factor loadings and scores are also provided.</p></div>","PeriodicalId":16431,"journal":{"name":"Journal of Multivariate Analysis","volume":"199 ","pages":"Article 105244"},"PeriodicalIF":1.4000,"publicationDate":"2023-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0047259X23000908/pdfft?md5=728de694d0d649b95d2f5a00e75117a5&pid=1-s2.0-S0047259X23000908-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Large factor model estimation by nuclear norm plus ℓ1 norm penalization\",\"authors\":\"Matteo Farnè, Angela Montanari\",\"doi\":\"10.1016/j.jmva.2023.105244\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>This paper provides a comprehensive estimation framework via nuclear norm plus <span><math><msub><mrow><mi>ℓ</mi></mrow><mrow><mn>1</mn></mrow></msub></math></span> norm penalization for high-dimensional approximate factor models with a sparse residual covariance. The underlying assumptions allow for non-pervasive latent eigenvalues and a prominent residual covariance pattern. In that context, existing approaches based on principal components may lead to misestimate the latent rank. On the contrary, the proposed optimization strategy recovers with high probability both the covariance matrix components and the latent rank and the residual sparsity pattern. Conditioning on the recovered low rank and sparse matrix varieties, we derive the finite sample covariance matrix estimators with the tightest error bound in minimax sense and we prove that the ensuing estimators of factor loadings and scores via Bartlett’s and Thomson’s methods have the same property. 
The asymptotic rates for those estimators of factor loadings and scores are also provided.</p></div>\",\"PeriodicalId\":16431,\"journal\":{\"name\":\"Journal of Multivariate Analysis\",\"volume\":\"199 \",\"pages\":\"Article 105244\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2023-10-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S0047259X23000908/pdfft?md5=728de694d0d649b95d2f5a00e75117a5&pid=1-s2.0-S0047259X23000908-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Multivariate Analysis\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0047259X23000908\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Multivariate Analysis","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0047259X23000908","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Large factor model estimation by nuclear norm plus ℓ1 norm penalization
This paper provides a comprehensive estimation framework, based on nuclear norm plus ℓ1 norm penalization, for high-dimensional approximate factor models with a sparse residual covariance. The underlying assumptions allow for non-pervasive latent eigenvalues and a prominent residual covariance pattern. In that setting, existing approaches based on principal components may misestimate the latent rank. By contrast, the proposed optimization strategy recovers, with high probability, the covariance matrix components, the latent rank, and the residual sparsity pattern. Conditioning on the recovered low-rank and sparse matrix varieties, we derive finite-sample covariance matrix estimators with the tightest error bound in the minimax sense, and we prove that the ensuing estimators of factor loadings and scores, obtained via Bartlett’s and Thomson’s methods, share the same property. Asymptotic rates for those estimators of factor loadings and scores are also provided.
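To make the penalization concrete, the sketch below illustrates a generic low-rank-plus-sparse covariance decomposition of the kind the abstract refers to: the sample covariance is approximated by L + S, with a nuclear norm penalty on the common component L and an ℓ1 penalty on the residual component S. This is a minimal illustrative sketch, not the authors' algorithm; the alternating proximal-gradient scheme, the step size, and the tuning constants psi and rho are assumptions introduced here for illustration.

import numpy as np

def svt(M, tau):
    # Singular value soft-thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # Entry-wise soft-thresholding: proximal operator of tau * l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def low_rank_plus_sparse(Sigma_hat, psi, rho, n_iter=500, step=0.5):
    # Minimize 0.5*||Sigma_hat - L - S||_F^2 + psi*||L||_* + rho*||S||_1
    # by alternating proximal-gradient updates (illustrative only).
    p = Sigma_hat.shape[0]
    L = np.zeros((p, p))
    S = np.zeros((p, p))
    for _ in range(n_iter):
        R = Sigma_hat - L - S                                  # current residual
        L = svt(L + step * R, step * psi)                      # nuclear-norm prox step on L
        S = soft(S + step * (Sigma_hat - L - S), step * rho)   # l1 prox step on S
    return L, S  # low-rank (common) and sparse (residual) covariance parts

# Hypothetical usage on simulated data:
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))
Sigma_hat = np.cov(X, rowvar=False)
L, S = low_rank_plus_sparse(Sigma_hat, psi=1.0, rho=0.1)
print(np.linalg.matrix_rank(L, tol=1e-6), np.mean(np.abs(S) > 1e-6))

In this formulation the nuclear norm penalty drives the estimated rank of L (the latent rank), while the ℓ1 penalty induces the residual sparsity pattern in S; the paper's contribution concerns the theoretical guarantees for such estimators, not this particular solver.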
About the journal:
Founded in 1971, the Journal of Multivariate Analysis (JMVA) is the central venue for the publication of new, relevant methodology and particularly innovative applications pertaining to the analysis and interpretation of multidimensional data.
The journal welcomes contributions to all aspects of multivariate data analysis and modeling, including cluster analysis, discriminant analysis, factor analysis, and multidimensional continuous or discrete distribution theory. Topics of current interest include, but are not limited to, inferential aspects of
Copula modeling
Functional data analysis
Graphical modeling
High-dimensional data analysis
Image analysis
Multivariate extreme-value theory
Sparse modeling
Spatial statistics.