Tackling biased complementary label learning with large margin

IF 8.1 | CAS Q1 (Computer Science) | COMPUTER SCIENCE, INFORMATION SYSTEMS | Information Sciences | Pub Date: 2024-08-28 | DOI: 10.1016/j.ins.2024.121400
{"title":"利用大余量解决有偏差的互补标签学习问题","authors":"","doi":"10.1016/j.ins.2024.121400","DOIUrl":null,"url":null,"abstract":"<div><p>Complementary Label Learning (CLL) is a typical weakly supervised learning protocol, where each instance is associated with one complementary label to specify a class that the instance does not belong to. Current CLL approaches assume that complementary labels are uniformly sampled from all non-ground-truth labels, so as to implicitly and locally share complementary labels by solely reducing the logit of complementary label in one way or another. In this paper, we point out that, when the uniform assumption does not hold, existing CLL methods are weakened their ability to share complementary labels and fail in creating classifiers with large logit margin (LM), resulting in a significant performance drop. To address these issues, we instead present complementary logit margin (CLM) and empirically prove that increasing CLM contributes to the share of complementary labels under the biased CLL setting. Accordingly, we propose a surrogate complementary one-versus-rest loss (COVR) and demonstrate that optimization on COVR can effectively increase CLM with both theoretical and empirical evidences. Extensive experiments verify that the proposed COVR exhibits substantial improvement for both the biased CLL and even a more practical CLL setting: instance-dependent complementary label learning.</p></div>","PeriodicalId":51063,"journal":{"name":"Information Sciences","volume":null,"pages":null},"PeriodicalIF":8.1000,"publicationDate":"2024-08-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Tackling biased complementary label learning with large margin\",\"authors\":\"\",\"doi\":\"10.1016/j.ins.2024.121400\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Complementary Label Learning (CLL) is a typical weakly supervised learning protocol, where each instance is associated with one complementary label to specify a class that the instance does not belong to. Current CLL approaches assume that complementary labels are uniformly sampled from all non-ground-truth labels, so as to implicitly and locally share complementary labels by solely reducing the logit of complementary label in one way or another. In this paper, we point out that, when the uniform assumption does not hold, existing CLL methods are weakened their ability to share complementary labels and fail in creating classifiers with large logit margin (LM), resulting in a significant performance drop. To address these issues, we instead present complementary logit margin (CLM) and empirically prove that increasing CLM contributes to the share of complementary labels under the biased CLL setting. Accordingly, we propose a surrogate complementary one-versus-rest loss (COVR) and demonstrate that optimization on COVR can effectively increase CLM with both theoretical and empirical evidences. 
Extensive experiments verify that the proposed COVR exhibits substantial improvement for both the biased CLL and even a more practical CLL setting: instance-dependent complementary label learning.</p></div>\",\"PeriodicalId\":51063,\"journal\":{\"name\":\"Information Sciences\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":8.1000,\"publicationDate\":\"2024-08-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Information Sciences\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0020025524013148\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"0\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Information Sciences","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0020025524013148","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"0","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract


Complementary Label Learning (CLL) is a typical weakly supervised learning protocol in which each instance is associated with one complementary label that specifies a class the instance does not belong to. Current CLL approaches assume that complementary labels are uniformly sampled from all non-ground-truth labels, and they implicitly and locally share complementary labels by solely reducing the logit of the complementary label in one way or another. In this paper, we point out that when the uniform assumption does not hold, existing CLL methods are weakened in their ability to share complementary labels and fail to create classifiers with a large logit margin (LM), resulting in a significant performance drop. To address these issues, we instead present the complementary logit margin (CLM) and empirically show that increasing CLM contributes to the sharing of complementary labels under the biased CLL setting. Accordingly, we propose a surrogate complementary one-versus-rest loss (COVR) and demonstrate, with both theoretical and empirical evidence, that optimizing COVR effectively increases CLM. Extensive experiments verify that the proposed COVR yields substantial improvements in both the biased CLL setting and an even more practical one: instance-dependent complementary label learning.
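The abstract does not spell out the COVR formulation, but the one-versus-rest rewrite commonly used in the CLL literature suggests its general shape: drive the complementary logit down while keeping every remaining logit viable, which widens the gap between them. The sketch below is a minimal PyTorch illustration under those assumptions; the function names, the complementary_logit_margin definition, and the softplus binary surrogate are our assumptions, not the paper's exact method.

```python
import torch
import torch.nn.functional as F


def complementary_logit_margin(logits: torch.Tensor, comp_labels: torch.Tensor) -> torch.Tensor:
    """Hypothetical CLM: gap between the smallest non-complementary logit
    and the complementary logit. Large and positive means the complementary
    class is scored below every remaining candidate class."""
    comp_logit = logits.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    # Exclude the complementary class from the min by setting it to +inf.
    masked = logits.scatter(1, comp_labels.unsqueeze(1), float("inf"))
    return masked.min(dim=1).values - comp_logit


def complementary_ovr_loss(logits: torch.Tensor, comp_labels: torch.Tensor) -> torch.Tensor:
    """OVR-style surrogate for complementary labels: treat the complementary
    class as a negative and each remaining class as a potential positive."""
    batch, num_classes = logits.shape
    comp_logit = logits.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    loss_comp = F.softplus(comp_logit)  # penalizes a high complementary logit
    mask = torch.ones_like(logits, dtype=torch.bool)
    mask.scatter_(1, comp_labels.unsqueeze(1), False)
    others = logits[mask].view(batch, num_classes - 1)
    # Any of the K-1 remaining classes could be the ground truth,
    # so their binary terms are averaged rather than summed.
    loss_others = F.softplus(-others).mean(dim=1)
    return (loss_comp + loss_others).mean()


# Toy usage: 4 instances, 10 classes, one complementary label per instance.
logits = torch.randn(4, 10, requires_grad=True)
comp_labels = torch.randint(0, 10, (4,))
loss = complementary_ovr_loss(logits, comp_labels)
loss.backward()  # gradients flow to the logits as they would to a network
print(loss.item())
print(complementary_logit_margin(logits.detach(), comp_labels))
```

The margin helper here is purely diagnostic: the paper's claim is that optimizing a COVR-style loss increases the complementary logit margin, and this surrogate is shaped to mimic that behavior rather than reproduce the published loss.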

Source journal
Information Sciences (Engineering & Technology - Computer Science: Information Systems)
CiteScore: 14.00
Self-citation rate: 17.30%
Articles published: 1322
Review time: 10.4 months
Journal description: Informatics and Computer Science Intelligent Systems Applications is an esteemed international journal that focuses on publishing original and creative research findings in the field of information sciences. We also feature a limited number of timely tutorial and surveying contributions. Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.
Latest articles in this journal
- Wavelet structure-texture-aware super-resolution for pedestrian detection
- HVASR: Enhancing 360-degree video delivery with viewport-aware super resolution
- KNEG-CL: Unveiling data patterns using a k-nearest neighbor evolutionary graph for efficient clustering
- Fréchet and Gateaux gH-differentiability for interval valued functions of multiple variables
- Detecting fuzzy-rough conditional anomalies