{"title":"Joint multitask feature learning and classifier design","authors":"S. Gutta, Qi Cheng","doi":"10.1109/CISS.2013.6552296","DOIUrl":null,"url":null,"abstract":"The problem of classification arises in many realworld applications. Often classification of more than two classes is broken down into a group of binary classification problems using the one-versus-rest or pairwise approaches. For each binary classification problem, feature selection and classifier design are usually conducted separately. In this paper, we propose a new multitask learning approach in which feature selection and classifier design for all the binary classification tasks are carried out simultaneously. We consider probabilistic nonlinear kernel classifiers for binary classification. For each binary classifier, we give weights to the features within the kernels. We assume that the matrix consisting of all the feature weights for all the tasks has a sparse component and a low rank component. The sparse component determines the features that are relevant to each classifier, and the low rank component determines the common feature subspace that is relevant to all the classifiers. Experimental results on synthetic data demonstrate that the proposed approach achieves higher classification accuracy compared to the conventional classifiers. The proposed method accurately determines the relevant features that are important to each binary classifier.","PeriodicalId":268095,"journal":{"name":"2013 47th Annual Conference on Information Sciences and Systems (CISS)","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-03-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 47th Annual Conference on Information Sciences and Systems (CISS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISS.2013.6552296","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
The problem of classification arises in many real-world applications. Classification with more than two classes is often decomposed into a group of binary classification problems using the one-versus-rest or pairwise approach. For each binary classification problem, feature selection and classifier design are usually conducted separately. In this paper, we propose a new multitask learning approach in which feature selection and classifier design for all the binary classification tasks are carried out simultaneously. We consider probabilistic nonlinear kernel classifiers for binary classification. For each binary classifier, we assign weights to the features within the kernel. We assume that the matrix consisting of all the feature weights for all the tasks has a sparse component and a low-rank component. The sparse component determines the features that are relevant to each individual classifier, and the low-rank component determines the common feature subspace that is relevant to all the classifiers. Experimental results on synthetic data demonstrate that the proposed approach achieves higher classification accuracy than conventional classifiers and accurately identifies the features that are important to each binary classifier.
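To make the two ingredients of the abstract concrete, below is a minimal illustrative sketch, not the authors' algorithm: (1) a feature-weighted RBF kernel, where each binary task has its own nonnegative weight per feature, and (2) one alternating proximal update that splits a task-by-feature weight matrix W into a sparse part S (entrywise soft-thresholding for the l1 penalty) and a low-rank part L (singular-value thresholding for the nuclear-norm penalty). All variable names, regularization values, and data here are hypothetical and chosen only for illustration.

```python
# Sketch under assumed notation: W is T x d (tasks x features), W ~ S + L.
import numpy as np

def weighted_rbf_kernel(X1, X2, w):
    """K[i, j] = exp(-sum_d w[d] * (X1[i, d] - X2[j, d])**2), with w >= 0."""
    diff = X1[:, None, :] - X2[None, :, :]              # shape (n1, n2, d)
    return np.exp(-np.einsum('ijd,d->ij', diff ** 2, w))

def soft_threshold(A, tau):
    """Proximal operator of tau * ||A||_1: entrywise shrinkage toward zero."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def svd_threshold(A, tau):
    """Proximal operator of tau * ||A||_*: shrink the singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# Hypothetical setup: T one-vs-rest binary tasks, d features, n samples.
rng = np.random.default_rng(0)
T, d, n = 4, 10, 30
W = rng.random((T, d))            # current feature-weight matrix, one row per task
X = rng.standard_normal((n, d))   # synthetic inputs

# Alternating proximal updates of the decomposition W ~ S + L.
lam_sparse, lam_rank = 0.1, 0.3   # hypothetical regularization strengths
S = np.zeros_like(W)
L = np.zeros_like(W)
for _ in range(50):
    S = soft_threshold(W - L, lam_sparse)   # task-specific relevant features
    L = svd_threshold(W - S, lam_rank)      # shared low-rank feature subspace

# Each task's kernel uses its own (clipped to nonnegative) weight row.
K_task0 = weighted_rbf_kernel(X, X, np.maximum(S[0] + L[0], 0.0))
print(K_task0.shape)              # (30, 30) Gram matrix for task 0
```

In this reading, the rows of S indicate which features matter to each binary classifier individually, while the low-rank L captures a feature subspace shared across all tasks; the paper's actual formulation couples this decomposition with the probabilistic kernel classifiers rather than running it as a standalone post-processing step.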