Estimating Latent Relative Labeling Importances for Multi-label Learning

Shuo He, Lei Feng, Li Li
{"title":"估计多标签学习的潜在相对标签重要性","authors":"Shuo He, Lei Feng, Li Li","doi":"10.1109/ICDM.2018.00127","DOIUrl":null,"url":null,"abstract":"In multi-label learning, each instance is associated with multiple labels simultaneously. Most of the existing approaches directly treat each label in a crisp manner, i.e. one class label is either relevant or irrelevant to the instance. However, the latent relative importance of each relevant label is regrettably ignored. In this paper, we propose a novel multi-label learning approach that aims to estimate the latent labeling importances while training the inductive model simultaneously. Specifically, we present a biconvex formulation with both instance and label graph regularization, and solve this problem using an alternating way. On the one hand, the inductive model is trained by minimizing the least squares loss of fitting the latent relative labeling importances. On the other hand, the latent relative labeling importances are estimated by the modeling outputs via a specially constrained label propagation procedure. Through the mutual adaption of the inductive model training and the specially constrained label propagation, an effective multi-label learning model is therefore built by optimally estimating the latent relative labeling importances. Extensive experimental results clearly show the effectiveness of the proposed approach.","PeriodicalId":286444,"journal":{"name":"2018 IEEE International Conference on Data Mining (ICDM)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":"{\"title\":\"Estimating Latent Relative Labeling Importances for Multi-label Learning\",\"authors\":\"Shuo He, Lei Feng, Li Li\",\"doi\":\"10.1109/ICDM.2018.00127\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In multi-label learning, each instance is associated with multiple labels simultaneously. Most of the existing approaches directly treat each label in a crisp manner, i.e. one class label is either relevant or irrelevant to the instance. However, the latent relative importance of each relevant label is regrettably ignored. In this paper, we propose a novel multi-label learning approach that aims to estimate the latent labeling importances while training the inductive model simultaneously. Specifically, we present a biconvex formulation with both instance and label graph regularization, and solve this problem using an alternating way. On the one hand, the inductive model is trained by minimizing the least squares loss of fitting the latent relative labeling importances. On the other hand, the latent relative labeling importances are estimated by the modeling outputs via a specially constrained label propagation procedure. Through the mutual adaption of the inductive model training and the specially constrained label propagation, an effective multi-label learning model is therefore built by optimally estimating the latent relative labeling importances. 
Extensive experimental results clearly show the effectiveness of the proposed approach.\",\"PeriodicalId\":286444,\"journal\":{\"name\":\"2018 IEEE International Conference on Data Mining (ICDM)\",\"volume\":\"17 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"12\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 IEEE International Conference on Data Mining (ICDM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICDM.2018.00127\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 IEEE International Conference on Data Mining (ICDM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICDM.2018.00127","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 12

Abstract

In multi-label learning, each instance is associated with multiple labels simultaneously. Most existing approaches treat each label in a crisp manner, i.e., a class label is either relevant or irrelevant to the instance. However, the latent relative importance of each relevant label is regrettably ignored. In this paper, we propose a novel multi-label learning approach that estimates the latent relative labeling importances while simultaneously training the inductive model. Specifically, we present a biconvex formulation with both instance and label graph regularization, and solve it in an alternating way. On the one hand, the inductive model is trained by minimizing the least squares loss of fitting the latent relative labeling importances. On the other hand, the latent relative labeling importances are estimated from the model outputs via a specially constrained label propagation procedure. Through the mutual adaptation of the inductive model training and the specially constrained label propagation, an effective multi-label learning model is built by optimally estimating the latent relative labeling importances. Extensive experimental results clearly show the effectiveness of the proposed approach.
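The abstract describes an alternating (biconvex) scheme: a least-squares inductive model is fit to the current estimates of the relative labeling importances, and those importances are then re-estimated by propagating the model outputs over an instance graph under a constraint that keeps them non-negative and normalized over each instance's relevant labels. The sketch below is a minimal, hypothetical illustration of that loop, not the authors' exact formulation: the ridge-style linear model, the similarity matrix S, the trade-off parameters alpha and beta, and the per-row simplex projection used as the "special constraint" are all assumptions introduced here for clarity.

```python
# Hypothetical sketch of the alternating estimation loop described in the
# abstract; variable names and the simplex-projection constraint are
# illustrative assumptions, not the paper's exact formulation.
import numpy as np

def row_simplex_project(M):
    """Project each row of M onto the probability simplex (non-negative, sums to 1).
    Stands in for the 'specially constrained' propagation step."""
    n, q = M.shape
    out = np.empty_like(M)
    for i in range(n):
        v = np.sort(M[i])[::-1]
        css = np.cumsum(v) - 1.0
        rho = np.nonzero(v - css / (np.arange(q) + 1) > 0)[0][-1]
        theta = css[rho] / (rho + 1.0)
        out[i] = np.maximum(M[i] - theta, 0.0)
    return out

def alternating_fit(X, Y, S, alpha=1.0, beta=0.1, iters=20):
    """X: n x d features, Y: n x q binary labels, S: n x n instance similarity."""
    n, d = X.shape
    # Initialise latent importances uniformly over each instance's relevant labels.
    U = Y / np.maximum(Y.sum(axis=1, keepdims=True), 1.0)
    L = np.diag(S.sum(axis=1)) - S          # graph Laplacian for instance smoothness
    for _ in range(iters):
        # (1) Inductive model: ridge-style least-squares fit of the importances U.
        W = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ U)
        F = X @ W                           # current model outputs
        # (2) Importance update: smooth F over the instance graph, mask to the
        #     relevant labels, then renormalise each row onto the simplex.
        U = np.linalg.solve(np.eye(n) + beta * L, F)
        U = row_simplex_project(U * Y)
    return W, U
```

With X, Y, and a similarity matrix S prepared (e.g., a k-NN Gaussian kernel over the instances), calling alternating_fit(X, Y, S) would return the linear model W and the estimated relative labeling importances U under these assumptions.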