{"title":"标签相关/不相关主题切换模型:处理无限标签无关主题的部分标记主题模型","authors":"Yasutoshi Ida, Takuma Nakamura, Takashi Matsumoto","doi":"10.1109/ACPR.2013.163","DOIUrl":null,"url":null,"abstract":"We propose a Label-Related/Unrelated Topic Switching Model (LRU-TSM) based on Latent Dirichlet Allocation (LDA) for modeling a labeled corpus. In this model, each word is allocated to a label-related topic or a label-unrelated topic. Label-related topics utilize label information, and label-unrelated topics utilize the framework of Bayesian Nonparametrics, which can estimate the number of topics in posterior distributions. Our model handles label-related and -unrelated topics explicitly, in contrast to the earlier model, and improves the performances of applications to which is applied. Using real-world datasets, we show that our model outperforms the earlier model in terms of perplexity and efficiency for label prediction tasks that involve predicting labels for documents or pictures without labels.","PeriodicalId":365633,"journal":{"name":"2013 2nd IAPR Asian Conference on Pattern Recognition","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-11-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Label-Related/Unrelated Topic Switching Model: A Partially Labeled Topic Model Handling Infinite Label-Unrelated Topics\",\"authors\":\"Yasutoshi Ida, Takuma Nakamura, Takashi Matsumoto\",\"doi\":\"10.1109/ACPR.2013.163\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We propose a Label-Related/Unrelated Topic Switching Model (LRU-TSM) based on Latent Dirichlet Allocation (LDA) for modeling a labeled corpus. In this model, each word is allocated to a label-related topic or a label-unrelated topic. Label-related topics utilize label information, and label-unrelated topics utilize the framework of Bayesian Nonparametrics, which can estimate the number of topics in posterior distributions. Our model handles label-related and -unrelated topics explicitly, in contrast to the earlier model, and improves the performances of applications to which is applied. Using real-world datasets, we show that our model outperforms the earlier model in terms of perplexity and efficiency for label prediction tasks that involve predicting labels for documents or pictures without labels.\",\"PeriodicalId\":365633,\"journal\":{\"name\":\"2013 2nd IAPR Asian Conference on Pattern Recognition\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-11-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2013 2nd IAPR Asian Conference on Pattern Recognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ACPR.2013.163\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 2nd IAPR Asian Conference on Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACPR.2013.163","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Label-Related/Unrelated Topic Switching Model: A Partially Labeled Topic Model Handling Infinite Label-Unrelated Topics
We propose a Label-Related/Unrelated Topic Switching Model (LRU-TSM) based on Latent Dirichlet Allocation (LDA) for modeling a labeled corpus. In this model, each word is allocated to either a label-related topic or a label-unrelated topic. Label-related topics utilize label information, while label-unrelated topics utilize the framework of Bayesian nonparametrics, which estimates the number of topics from the posterior distribution. Unlike the earlier model, our model handles label-related and label-unrelated topics explicitly, and it improves the performance of applications to which it is applied. Using real-world datasets, we show that our model outperforms the earlier model in terms of perplexity and efficiency on label prediction tasks, in which labels are predicted for documents or pictures that have no labels.
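To make the word-level switching mechanism described in the abstract concrete, the following is a minimal generative sketch, not the authors' actual model or inference procedure. It assumes, for illustration only, a fixed Bernoulli switch probability `pi`, one label-related topic per observed document label (as in partially labeled LDA), and a truncated stick-breaking approximation to a Dirichlet-process prior over the unbounded label-unrelated topics; the paper's actual priors and posterior inference are not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, truncation):
    """Truncated stick-breaking weights approximating a Dirichlet-process
    prior over the (potentially infinite) label-unrelated topics."""
    betas = rng.beta(1.0, alpha, size=truncation)
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
    weights = betas * remaining
    return weights / weights.sum()  # renormalize the truncated tail

def generate_document(doc_labels, n_words, pi=0.7, alpha=1.0, truncation=50):
    """Schematic generative process for one document: each word is routed
    by a Bernoulli switch either to a label-related topic (one per observed
    label) or to a label-unrelated topic drawn from the nonparametric prior."""
    related_dist = rng.dirichlet(np.ones(len(doc_labels)))      # over this document's labels
    unrelated_dist = stick_breaking_weights(alpha, truncation)  # over background topics
    assignments = []
    for _ in range(n_words):
        if rng.random() < pi:  # switch chose a label-related topic
            idx = rng.choice(len(doc_labels), p=related_dist)
            assignments.append(("label-related", doc_labels[idx]))
        else:                  # switch chose a label-unrelated topic
            k = int(rng.choice(truncation, p=unrelated_dist))
            assignments.append(("label-unrelated", k))
    return assignments

# Example: a picture tagged "cat" and "outdoor"; some words fall under the
# tags, the rest go to background topics whose number is learned in practice.
print(generate_document(["cat", "outdoor"], n_words=10))
```

In the real model the switch variable and the topic assignments would be inferred jointly from data (e.g., by collapsed Gibbs sampling over the posterior), which is what allows the number of label-unrelated topics to be estimated rather than fixed in advance.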