ASCL: Accelerating Semi-Supervised Learning via Contrastive Learning

Concurrency and Computation: Practice and Experience · IF 1.5 · CAS Zone 4 (Computer Science) · JCR Q3 (Computer Science, Software Engineering) · Pub Date: 2024-10-08 · DOI: 10.1002/cpe.8293
Haixiong Liu, Zuoyong Li, Jiawei Wu, Kun Zeng, Rong Hu, Wei Zeng
Abstract

Semi-supervised learning (SSL) is widely used in machine learning; it leverages both labeled and unlabeled data to improve model performance. SSL aims to optimize class mutual information, but because labels are scarce, noisy pseudo-labels introduce false class information. These algorithms therefore often need substantial training time to iteratively refine pseudo-labels. To tackle this challenge, we propose a novel plug-and-play method named Accelerating semi-supervised learning via contrastive learning (ASCL). It combines contrastive learning with uncertainty-based selection to improve performance and accelerate the convergence of SSL algorithms. Contrastive learning initially emphasizes the mutual information between samples, reducing dependence on pseudo-labels, and then gradually shifts to maximizing the mutual information between classes, in line with the objective of semi-supervised learning. Uncertainty-based selection provides a robust mechanism for acquiring pseudo-labels. Together, the contrastive learning module and the uncertainty-based selection module form a virtuous cycle that improves the performance of the proposed model. Extensive experiments demonstrate that ASCL outperforms state-of-the-art methods in both convergence efficiency and accuracy. With only one label per class on CIFAR-10, applying ASCL to Pseudo-label, UDA (unsupervised data augmentation for consistency training), and FixMatch improves classification accuracy by 16.32%, 6.9%, and 24.43%, respectively, over the original methods, while reducing the required training time by almost 50%.
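The abstract does not include implementation details. As one common instantiation, the "mutual information between samples" objective in contrastive learning is typically realized as an InfoNCE-style loss over two augmented views of each sample. The NumPy sketch below illustrates that loss; the function name and the temperature value are our own illustrative choices, not taken from the paper:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.5):
    """InfoNCE loss over two augmented views.

    z1, z2: (N, D) arrays of embeddings for two views of the same
    N samples. Row i of z1 and row i of z2 form the positive pair;
    all other rows act as negatives.
    """
    # L2-normalize rows so dot products are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positive pairs sit on the diagonal; minimize their negative log-prob.
    return -np.mean(np.diag(log_prob))
```

Maximizing agreement between views in this way is label-free, which is why such a term can reduce early reliance on noisy pseudo-labels.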
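The uncertainty-based selection mechanism is likewise not specified in the abstract. One common interpretation, also used by FixMatch, is to accept a pseudo-label only when the model's top-class confidence exceeds a threshold. A minimal sketch under that assumption (the function name and the 0.95 default are illustrative, not from the paper):

```python
import numpy as np

def select_pseudo_labels(probs, threshold=0.95):
    """Keep pseudo-labels only for low-uncertainty predictions.

    probs: (N, C) softmax outputs on unlabeled data.
    Returns (indices, labels) for samples whose top-class
    probability exceeds `threshold`.
    """
    confidence = probs.max(axis=1)        # top-class probability per sample
    labels = probs.argmax(axis=1)         # candidate pseudo-label per sample
    keep = np.flatnonzero(confidence > threshold)
    return keep, labels[keep]
```

Filtering this way trades pseudo-label coverage for precision: fewer unlabeled samples contribute to training, but the retained labels are more likely correct, which is the "robust mechanism" the abstract alludes to.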

About the journal: Concurrency and Computation: Practice and Experience (CCPE) publishes high-quality, original research papers, and authoritative research review papers, in the overlapping fields of: parallel and distributed computing; high-performance computing; computational and data science; artificial intelligence and machine learning; big data applications, algorithms, and systems; network science; ontologies and semantics; security and privacy; cloud/edge/fog computing; green computing; and quantum computing.