{"title":"通过秩发散实现充分降维的新方法","authors":"Tianqing Liu, Danning Li, Fengjiao Ren, Jianguo Sun, Xiaohui Yuan","doi":"10.1007/s11749-024-00929-7","DOIUrl":null,"url":null,"abstract":"<p>Sufficient dimension reduction is commonly performed to achieve data reduction and help data visualization. Its main goal is to identify functions of the predictors that are smaller in number than the predictors and contain the same information as the predictors for the response. In this paper, we are concerned with the linear functions of the predictors, which determine a central subspace that preserves sufficient information about the conditional distribution of a response given covariates. Many methods have been developed in the literature for the estimation of the central subspace. However, most of the existing sufficient dimension reduction methods are sensitive to outliers and require some strict restrictions on both covariates and response. To address this, we propose a novel dependence measure, rank divergence, and develop a rank divergence-based sufficient dimension reduction approach. The new method only requires some mild conditions on the covariates and response and is robust to outliers or heavy-tailed distributions. Moreover, it applies to both discrete or categorical covariates and multivariate responses. 
The consistency of the resulting estimator of the central subspace is established, and numerical studies suggest that it works well in practical situations.</p>","PeriodicalId":51189,"journal":{"name":"Test","volume":"72 1","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2024-05-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A new sufficient dimension reduction method via rank divergence\",\"authors\":\"Tianqing Liu, Danning Li, Fengjiao Ren, Jianguo Sun, Xiaohui Yuan\",\"doi\":\"10.1007/s11749-024-00929-7\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Sufficient dimension reduction is commonly performed to achieve data reduction and help data visualization. Its main goal is to identify functions of the predictors that are smaller in number than the predictors and contain the same information as the predictors for the response. In this paper, we are concerned with the linear functions of the predictors, which determine a central subspace that preserves sufficient information about the conditional distribution of a response given covariates. Many methods have been developed in the literature for the estimation of the central subspace. However, most of the existing sufficient dimension reduction methods are sensitive to outliers and require some strict restrictions on both covariates and response. To address this, we propose a novel dependence measure, rank divergence, and develop a rank divergence-based sufficient dimension reduction approach. The new method only requires some mild conditions on the covariates and response and is robust to outliers or heavy-tailed distributions. Moreover, it applies to both discrete or categorical covariates and multivariate responses. 
The consistency of the resulting estimator of the central subspace is established, and numerical studies suggest that it works well in practical situations.</p>\",\"PeriodicalId\":51189,\"journal\":{\"name\":\"Test\",\"volume\":\"72 1\",\"pages\":\"\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2024-05-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Test\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s11749-024-00929-7\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Test","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s11749-024-00929-7","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
A new sufficient dimension reduction method via rank divergence
Sufficient dimension reduction is commonly performed to achieve data reduction and facilitate data visualization. Its main goal is to identify functions of the predictors that are fewer in number than the predictors yet contain the same information about the response. In this paper, we are concerned with linear functions of the predictors, which determine a central subspace that preserves sufficient information about the conditional distribution of the response given the covariates. Many methods have been developed in the literature for estimating the central subspace. However, most existing sufficient dimension reduction methods are sensitive to outliers and impose strict restrictions on both the covariates and the response. To address this, we propose a novel dependence measure, rank divergence, and develop a rank divergence-based sufficient dimension reduction approach. The new method requires only mild conditions on the covariates and response and is robust to outliers and heavy-tailed distributions. Moreover, it applies to discrete or categorical covariates as well as to multivariate responses. The consistency of the resulting estimator of the central subspace is established, and numerical studies suggest that it works well in practical situations.
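The central-subspace setup described in the abstract can be illustrated with a classical sufficient dimension reduction estimator, sliced inverse regression (SIR). This is a minimal sketch of the general idea only, not the paper's rank-divergence method; the simulated model, slice count, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 2000, 6
beta = np.zeros(p)
beta[0] = 1.0                               # true central-subspace direction
X = rng.standard_normal((n, p))
y = np.sin(X @ beta) + 0.1 * rng.standard_normal(n)

# Standardize (whiten) the predictors.
Xc = X - X.mean(axis=0)
Sigma = Xc.T @ Xc / n
L = np.linalg.cholesky(Sigma)
Z = np.linalg.solve(L, Xc.T).T              # z = L^{-1} x, so Cov(Z) ≈ I

# Slice the response and average the whitened predictors within each slice.
H = 10
order = np.argsort(y)
M = np.zeros((p, p))
for idx in np.array_split(order, H):
    m = Z[idx].mean(axis=0)
    M += (len(idx) / n) * np.outer(m, m)    # weighted covariance of slice means

# The leading eigenvector of M estimates the central subspace in the
# whitened scale; map it back to the original predictor scale.
vals, vecs = np.linalg.eigh(M)
b_hat = np.linalg.solve(L.T, vecs[:, -1])
b_hat /= np.linalg.norm(b_hat)

cos = abs(b_hat @ beta)                     # alignment with the true direction
```

With a single true direction, the estimated vector `b_hat` should align closely with `beta` (cosine near 1), which is the sense in which the linear reduction "contains the same information as the predictors" for the response. SIR, however, relies on moment conditions on the covariates, which is exactly the kind of restriction the rank-divergence approach is designed to relax.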
Journal description:
TEST is an international journal of Statistics and Probability, sponsored by the Spanish Society of Statistics and Operations Research. English is the official language of the journal.
The emphasis of TEST is placed on papers containing original theoretical contributions of direct or potential value in applications. In this respect, methodological content is considered crucial for papers published in TEST, but the practical implications of the methodological aspects are also relevant. Original, sound manuscripts on either well-established or emerging areas within the scope of the journal are welcome.
One volume is published annually in four issues. In addition to the regular contributions, each issue of TEST contains an invited paper from an internationally recognized statistician on a current and challenging topic, together with discussions.