{"title":"基于Douglas-Rachford分裂算法的特征选择鲁棒分类","authors":"M. Barlaud, M. Antonini","doi":"10.1051/proc/202171102","DOIUrl":null,"url":null,"abstract":"This paper deals with supervised classification and feature selection with application in the context of high dimensional features. A classical approach leads to an optimization problem minimizing the within sum of squares in the clusters (I2 norm) with an I1 penalty in order to promote sparsity. It has been known for decades that I1 norm is more robust than I2 norm to outliers. In this paper, we deal with this issue using a new proximal splitting method for the minimization of a criterion using I2 norm both for the constraint and the loss function. Since the I1 criterion is only convex and not gradient Lipschitz, we advocate the use of a Douglas-Rachford minimization solution. We take advantage of the particular form of the cost and, using a change of variable, we provide a new efficient tailored primal Douglas-Rachford splitting algorithm which is very effective on high dimensional dataset. We also provide an efficient classifier in the projected space based on medoid modeling. Experiments on two biological datasets and a computer vision dataset show that our method significantly improves the results compared to those obtained using a quadratic loss function.","PeriodicalId":53260,"journal":{"name":"ESAIM Proceedings and Surveys","volume":"2 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2021-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Robust classification with feature selection using an application of the Douglas-Rachford splitting algorithm\",\"authors\":\"M. Barlaud, M. Antonini\",\"doi\":\"10.1051/proc/202171102\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper deals with supervised classification and feature selection with application in the context of high dimensional features. A classical approach leads to an optimization problem minimizing the within sum of squares in the clusters (I2 norm) with an I1 penalty in order to promote sparsity. It has been known for decades that I1 norm is more robust than I2 norm to outliers. In this paper, we deal with this issue using a new proximal splitting method for the minimization of a criterion using I2 norm both for the constraint and the loss function. Since the I1 criterion is only convex and not gradient Lipschitz, we advocate the use of a Douglas-Rachford minimization solution. We take advantage of the particular form of the cost and, using a change of variable, we provide a new efficient tailored primal Douglas-Rachford splitting algorithm which is very effective on high dimensional dataset. We also provide an efficient classifier in the projected space based on medoid modeling. 
Experiments on two biological datasets and a computer vision dataset show that our method significantly improves the results compared to those obtained using a quadratic loss function.\",\"PeriodicalId\":53260,\"journal\":{\"name\":\"ESAIM Proceedings and Surveys\",\"volume\":\"2 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-08-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ESAIM Proceedings and Surveys\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1051/proc/202171102\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ESAIM Proceedings and Surveys","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1051/proc/202171102","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Robust classification with feature selection using an application of the Douglas-Rachford splitting algorithm
M. Barlaud, M. Antonini
ESAIM Proceedings and Surveys, 2021. DOI: 10.1051/proc/202171102
This paper deals with supervised classification and feature selection, with applications in the context of high-dimensional features. A classical approach leads to an optimization problem minimizing the within-cluster sum of squares (an ℓ2 criterion) with an ℓ1 penalty to promote sparsity. It has been known for decades that the ℓ1 norm is more robust to outliers than the ℓ2 norm. In this paper, we address this issue with a new proximal splitting method that minimizes a criterion using the ℓ1 norm both for the constraint and for the loss function. Since the ℓ1 criterion is convex but not gradient-Lipschitz, we advocate the use of a Douglas-Rachford minimization. We take advantage of the particular form of the cost and, using a change of variable, we provide a new, efficient, tailored primal Douglas-Rachford splitting algorithm that is very effective on high-dimensional datasets. We also provide an efficient classifier in the projected space based on medoid modeling. Experiments on two biological datasets and a computer vision dataset show that our method significantly improves the results compared to those obtained with a quadratic loss function.
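
To make the optimization scheme concrete, the Python sketch below shows the generic Douglas-Rachford iteration for minimizing a sum f(x) + g(x) of two convex, possibly non-smooth terms, instantiated with the two ℓ1 proximal maps the abstract refers to: soft-thresholding for an ℓ1 data-fit term and Euclidean projection onto an ℓ1 ball for the sparsity constraint. This is a minimal illustration under stated assumptions, not the paper's tailored primal algorithm; the toy problem, the function names, and parameters such as radius, gamma and n_iter are illustrative choices introduced here.

# Minimal sketch (illustrative, not the paper's tailored primal algorithm):
# generic Douglas-Rachford splitting for min_x f(x) + g(x), given only the
# proximal maps of f and g, instantiated with an l1 data-fit term and an
# l1-ball sparsity constraint.
import numpy as np


def soft_threshold(v, tau):
    # Prox of tau * ||.||_1: element-wise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)


def project_l1_ball(v, radius):
    # Euclidean projection onto {x : ||x||_1 <= radius} (sort-based scheme).
    if np.abs(v).sum() <= radius:
        return v.copy()
    u = np.sort(np.abs(v))[::-1]          # magnitudes, sorted in decreasing order
    css = np.cumsum(u)
    idx = np.arange(1, v.size + 1)
    rho = np.nonzero(u * idx > css - radius)[0][-1]
    theta = (css[rho] - radius) / (rho + 1.0)
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)


def douglas_rachford(prox_f, prox_g, x0, gamma=1.0, lam=1.0, n_iter=500):
    # Douglas-Rachford iteration for min f(x) + g(x); neither term needs to be
    # smooth, only its prox map is required -- which is why the scheme handles
    # non-gradient-Lipschitz l1 terms.
    x = x0.copy()
    for _ in range(n_iter):
        y = prox_f(x, gamma)              # prox step on f
        z = prox_g(2.0 * y - x, gamma)    # prox step on the reflected point
        x = x + lam * (z - y)             # relaxed update of the driving sequence
    return prox_f(x, gamma)               # the "shadow" iterate converges to a minimizer


if __name__ == "__main__":
    # Toy instance (hypothetical data): min_x ||x - b||_1  s.t.  ||x||_1 <= radius,
    # with b a sparse signal corrupted by heavy-tailed (outlier-prone) noise.
    rng = np.random.default_rng(0)
    b = np.zeros(200)
    b[rng.choice(200, 10, replace=False)] = rng.normal(0.0, 5.0, 10)
    b += rng.standard_t(df=2, size=200)

    prox_loss = lambda v, g: b + soft_threshold(v - b, g)    # prox of ||. - b||_1
    prox_constraint = lambda v, g: project_l1_ball(v, 30.0)  # prox of the l1-ball indicator

    x_hat = douglas_rachford(prox_loss, prox_constraint, np.zeros_like(b))
    print("l1 norm:", np.abs(x_hat).sum(),
          "nonzeros:", int(np.count_nonzero(np.abs(x_hat) > 1e-8)))

Because only proximal maps are used, neither term has to be differentiable, which is the property invoked above for the non-gradient-Lipschitz ℓ1 criterion; the paper's tailored algorithm additionally exploits the specific structure of the classification cost through the change of variable mentioned in the abstract, which is not reproduced in this sketch.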