Hideko Kawakubo, Masashi Sugiyama
2016 Conference on Technologies and Applications of Artificial Intelligence (TAAI), November 2016
DOI: 10.1109/TAAI.2016.7880182
Semi-supervised sufficient dimension reduction under class-prior change
Sufficient dimension reduction (SDR) is a popular framework for supervised dimension reduction that aims to reduce the dimensionality of input data while maximally preserving information about the output. In many recent supervised classification tasks, however, the balance of samples across classes can differ between the training and testing phases. This phenomenon, referred to as class-prior change, causes existing SDR methods to perform poorly, particularly when the training data are highly imbalanced. In this paper, we extend the state-of-the-art SDR method called least-squares gradients for dimension reduction (LSGDR) to cope with such class-prior change in the semi-supervised setting, where unlabeled test data are available in addition to labeled training data. Through experiments, we demonstrate the usefulness of the proposed method.
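The correction idea underlying class-prior change can be made concrete with a minimal sketch. This is not the paper's LSGDR estimator; it only illustrates the standard importance-weighting step, where each training sample is reweighted by the ratio of test-phase to training-phase class priors. The function name `class_prior_weights` and the assumed balanced test priors are hypothetical, for illustration only; in the semi-supervised setting of the paper, the test priors would themselves be estimated from unlabeled test data rather than given.

```python
import numpy as np

def class_prior_weights(y_train, test_priors):
    """Per-sample importance weights w_i = p_test(y_i) / p_train(y_i).

    y_train     : array of training labels
    test_priors : dict mapping each class to its (assumed) test-phase prior
    """
    classes, counts = np.unique(y_train, return_counts=True)
    train_priors = counts / counts.sum()          # empirical training priors
    ratio = {c: test_priors[c] / p for c, p in zip(classes, train_priors)}
    return np.array([ratio[y] for y in y_train])

# Highly imbalanced training labels: 80% class 0, 20% class 1.
y_train = np.array([0] * 8 + [1] * 2)

# Assume the test phase is balanced (a labeled-for-illustration choice).
w = class_prior_weights(y_train, {0: 0.5, 1: 0.5})
# Minority-class samples get weight 0.5/0.2 = 2.5, majority 0.5/0.8 = 0.625,
# so the weighted class totals become equal (8 * 0.625 = 2 * 2.5 = 5).
```

Plugging such weights into a weighted SDR objective is what lets an estimator track the test-phase distribution instead of the imbalanced training one.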