{"title":"基于特征感知成本敏感标签嵌入的多标签分类","authors":"Hsien-Chun Chiu, Hsuan-Tien Lin","doi":"10.1109/TAAI.2018.00018","DOIUrl":null,"url":null,"abstract":"Multi-label classification (MLC) is an important learning problem where each instance is annotated with multiple labels. Label embedding (LE) is an important family of methods for MLC that extracts and utilizes the latent structure of labels towards better performance. Within the family, feature-aware LE methods, which jointly consider the feature and label information during extraction, have been shown to reach better performance than feature-unaware ones. Nevertheless, current feature-aware LE methods are not designed to flexibly adapt to different evaluation criteria. In this work, we propose a novel feature-aware LE method that takes the desired evaluation criterion (cost) into account during training. The method, named Feature-aware Cost-sensitive Label Embedding (FaCLE), encodes the criterion into the distance between embedded vectors with a deep Siamese network. The feature-aware characteristic of FaCLE is achieved with a loss function that jointly considers the embedding error and the feature-to-embedding error. Moreover, FaCLE is coupled with an additional-bit trick to deal with the possibly asymmetric criteria. 
Experiment results across different data sets and evaluation criteria demonstrate that FaCLE is superior to other state-of-the-art feature-aware LE methods and competitive to cost-sensitive LE methods.","PeriodicalId":211734,"journal":{"name":"2018 Conference on Technologies and Applications of Artificial Intelligence (TAAI)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Multi-Label Classification with Feature-Aware Cost-Sensitive Label Embedding\",\"authors\":\"Hsien-Chun Chiu, Hsuan-Tien Lin\",\"doi\":\"10.1109/TAAI.2018.00018\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Multi-label classification (MLC) is an important learning problem where each instance is annotated with multiple labels. Label embedding (LE) is an important family of methods for MLC that extracts and utilizes the latent structure of labels towards better performance. Within the family, feature-aware LE methods, which jointly consider the feature and label information during extraction, have been shown to reach better performance than feature-unaware ones. Nevertheless, current feature-aware LE methods are not designed to flexibly adapt to different evaluation criteria. In this work, we propose a novel feature-aware LE method that takes the desired evaluation criterion (cost) into account during training. The method, named Feature-aware Cost-sensitive Label Embedding (FaCLE), encodes the criterion into the distance between embedded vectors with a deep Siamese network. The feature-aware characteristic of FaCLE is achieved with a loss function that jointly considers the embedding error and the feature-to-embedding error. Moreover, FaCLE is coupled with an additional-bit trick to deal with the possibly asymmetric criteria. 
Experiment results across different data sets and evaluation criteria demonstrate that FaCLE is superior to other state-of-the-art feature-aware LE methods and competitive to cost-sensitive LE methods.\",\"PeriodicalId\":211734,\"journal\":{\"name\":\"2018 Conference on Technologies and Applications of Artificial Intelligence (TAAI)\",\"volume\":\"7 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 Conference on Technologies and Applications of Artificial Intelligence (TAAI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TAAI.2018.00018\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 Conference on Technologies and Applications of Artificial Intelligence (TAAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TAAI.2018.00018","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Multi-Label Classification with Feature-Aware Cost-Sensitive Label Embedding
Multi-label classification (MLC) is an important learning problem in which each instance is annotated with multiple labels. Label embedding (LE) is an important family of methods for MLC that extracts and utilizes the latent structure of labels toward better performance. Within this family, feature-aware LE methods, which jointly consider the feature and label information during extraction, have been shown to reach better performance than feature-unaware ones. Nevertheless, current feature-aware LE methods are not designed to flexibly adapt to different evaluation criteria. In this work, we propose a novel feature-aware LE method that takes the desired evaluation criterion (cost) into account during training. The method, named Feature-aware Cost-sensitive Label Embedding (FaCLE), encodes the criterion into the distance between embedded vectors with a deep Siamese network. The feature-aware characteristic of FaCLE is achieved with a loss function that jointly considers the embedding error and the feature-to-embedding error. Moreover, FaCLE is coupled with an additional-bit trick to deal with possibly asymmetric criteria. Experimental results across different data sets and evaluation criteria demonstrate that FaCLE is superior to other state-of-the-art feature-aware LE methods and competitive with cost-sensitive LE methods.
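The abstract describes a loss that combines two terms: an embedding error, which asks the distance between embedded label vectors to reflect the evaluation cost, and a feature-to-embedding error, which asks a feature-side map to land near the label embedding. The following is a minimal sketch of such a joint objective; the function name `facle_loss`, the trade-off weight `alpha`, and the exact form of each term are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def facle_loss(z_i, z_j, cost, g_xi, g_xj, alpha=1.0):
    """Sketch of a FaCLE-style joint objective (illustrative only).

    z_i, z_j   : embedded vectors for label sets y_i, y_j
    cost       : evaluation cost c(y_i, y_j) between the two label sets
    g_xi, g_xj : feature-side predictions of the embeddings from x_i, x_j
    alpha      : hypothetical trade-off weight (not from the paper)
    """
    # Embedding error: squared embedding distance should encode the cost,
    # mirroring the Siamese-network idea of cost-as-distance.
    d2 = np.sum((z_i - z_j) ** 2)
    embed_err = (d2 - cost) ** 2
    # Feature-to-embedding error: the feature map should reproduce the
    # label embedding, which is what makes the method feature-aware.
    feat_err = np.sum((g_xi - z_i) ** 2) + np.sum((g_xj - z_j) ** 2)
    return embed_err + alpha * feat_err

# Toy usage: when the embedding distance matches the cost and the feature
# map reproduces the embeddings exactly, the loss is zero.
z_a = np.array([0.0, 0.0])
z_b = np.array([1.0, 0.0])
print(facle_loss(z_a, z_b, cost=1.0, g_xi=z_a, g_xj=z_b))  # → 0.0
```

In the paper the two maps are learned jointly by a deep network; the sketch above only evaluates the objective for fixed vectors to make the structure of the two error terms concrete.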