{"title":"D4R: 高维度下的双稳健缩减秩回归","authors":"Xiaoyan Ma , Lili Wei , Wanfeng Liang","doi":"10.1016/j.jspi.2024.106162","DOIUrl":null,"url":null,"abstract":"<div><p>In this paper, we study high-dimensional reduced rank regression and propose a doubly robust procedure, called <span><math><mi>D4R</mi></math></span>, meaning concurrent robustness to both outliers in predictors and heavy-tailed random noise. The proposed method uses the composite gradient descent based algorithm to solve the nonconvex optimization problem resulting from combining Tukey’s biweight loss with spectral regularization. Both theoretical and numerical properties of <span><math><mi>D4R</mi></math></span> are investigated. We establish non-asymptotic estimation error bounds under both the Frobenius norm and the nuclear norm in the high-dimensional setting. Simulation studies and real example show that the performance of <span><math><mi>D4R</mi></math></span> is better than that of several existing estimation methods.</p></div>","PeriodicalId":50039,"journal":{"name":"Journal of Statistical Planning and Inference","volume":"232 ","pages":"Article 106162"},"PeriodicalIF":0.8000,"publicationDate":"2024-02-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"D4R: Doubly robust reduced rank regression in high dimension\",\"authors\":\"Xiaoyan Ma , Lili Wei , Wanfeng Liang\",\"doi\":\"10.1016/j.jspi.2024.106162\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In this paper, we study high-dimensional reduced rank regression and propose a doubly robust procedure, called <span><math><mi>D4R</mi></math></span>, meaning concurrent robustness to both outliers in predictors and heavy-tailed random noise. The proposed method uses the composite gradient descent based algorithm to solve the nonconvex optimization problem resulting from combining Tukey’s biweight loss with spectral regularization. Both theoretical and numerical properties of <span><math><mi>D4R</mi></math></span> are investigated. We establish non-asymptotic estimation error bounds under both the Frobenius norm and the nuclear norm in the high-dimensional setting. 
Simulation studies and real example show that the performance of <span><math><mi>D4R</mi></math></span> is better than that of several existing estimation methods.</p></div>\",\"PeriodicalId\":50039,\"journal\":{\"name\":\"Journal of Statistical Planning and Inference\",\"volume\":\"232 \",\"pages\":\"Article 106162\"},\"PeriodicalIF\":0.8000,\"publicationDate\":\"2024-02-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Statistical Planning and Inference\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0378375824000193\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Statistical Planning and Inference","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0378375824000193","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
D4R: Doubly robust reduced rank regression in high dimension
In this paper, we study high-dimensional reduced rank regression and propose a doubly robust procedure, called D4R, meaning concurrent robustness to both outliers in the predictors and heavy-tailed random noise. The proposed method uses a composite gradient descent algorithm to solve the nonconvex optimization problem that results from combining Tukey's biweight loss with spectral regularization. Both theoretical and numerical properties of D4R are investigated. We establish non-asymptotic estimation error bounds under both the Frobenius norm and the nuclear norm in the high-dimensional setting. Simulation studies and a real-data example show that D4R outperforms several existing estimation methods.
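To make the optimization step concrete, below is a minimal NumPy sketch of a composite (proximal) gradient iteration for an objective of this general form: an entrywise Tukey biweight loss on the residuals plus a nuclear-norm penalty, with the proximal step carried out by singular value soft-thresholding. This is an illustrative reconstruction under those assumptions, not the authors' implementation; the function names, step size, penalty level lam, and the biweight cutoff c = 4.685 are hypothetical choices.

```python
import numpy as np

def tukey_grad(r, c=4.685):
    """Derivative (psi function) of Tukey's biweight loss; zero beyond the cutoff c."""
    inside = np.abs(r) <= c
    return np.where(inside, r * (1.0 - (r / c) ** 2) ** 2, 0.0)

def svt(M, tau):
    """Singular value soft-thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s = np.maximum(s - tau, 0.0)
    return (U * s) @ Vt

def d4r_sketch(X, Y, lam, c=4.685, step=None, n_iter=500):
    """Composite gradient descent for (1/n) * sum Tukey(Y - XC) + lam * ||C||_*.
    Hypothetical sketch: step size and stopping rule are illustrative only."""
    n, p = X.shape
    q = Y.shape[1]
    C = np.zeros((p, q))
    if step is None:
        # Crude step size from a Lipschitz-type bound on the smooth part.
        step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        R = Y - X @ C                       # residual matrix
        G = -(X.T @ tukey_grad(R, c)) / n   # gradient of the robust smooth loss
        C = svt(C - step * G, step * lam)   # gradient step + nuclear-norm prox
    return C

# Toy usage: rank-2 coefficient matrix, heavy-tailed (t with 2 df) noise.
rng = np.random.default_rng(0)
n, p, q, r = 200, 10, 8, 2
X = rng.standard_normal((n, p))
C_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
Y = X @ C_true + rng.standard_t(df=2, size=(n, q))
C_hat = d4r_sketch(X, Y, lam=0.5)
print("relative error:", np.linalg.norm(C_hat - C_true) / np.linalg.norm(C_true))
```

The point mirrored here is that the nonconvexity comes only from the bounded biweight loss, while the nuclear-norm penalty is handled exactly through its proximal operator, so each iteration reduces to a gradient step followed by soft-thresholding of the singular values.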
About the journal:
The Journal of Statistical Planning and Inference offers itself as a multifaceted and all-inclusive bridge between classical aspects of statistics and probability, and the emerging interdisciplinary aspects that have a potential of revolutionizing the subject. While we maintain our traditional strength in statistical inference, design, classical probability, and large sample methods, we also have a far more inclusive and broadened scope to keep up with the new problems that confront us as statisticians, mathematicians, and scientists.
We publish high-quality articles in all branches of statistics, probability, discrete mathematics, machine learning, and bioinformatics. We also especially welcome well-written and up-to-date review articles on fundamental themes of statistics, probability, machine learning, and general biostatistics. Thoughtful letters to the editors, interesting problems in need of a solution, and short notes carrying an element of elegance or beauty are equally welcome.