Bootstrap Contrastive Domain Adaptation

Memetic Computing | IF 3.3 | Tier 2, Computer Science | Q2, Computer Science, Artificial Intelligence | Pub Date: 2024-08-13 | DOI: 10.1007/s12293-024-00422-6
Yan Jia, Yuqing Cheng, Peng Qiao
{"title":"引导式对比领域适应","authors":"Yan Jia, Yuqing Cheng, Peng Qiao","doi":"10.1007/s12293-024-00422-6","DOIUrl":null,"url":null,"abstract":"<p>Self-supervised learning, particularly through contrastive learning, has shown significant promise in vision tasks. Although effective, contrastive learning faces the issue of false negatives, particularly under domain shifts in domain adaptation scenarios. The Bootstrap Your Own Latent approach, with its asymmetric structure and avoidance of unnecessary negative samples, offers a foundation to address this issue, which remains underexplored in domain adaptation. We introduce an asymmetrically structured network, the Bootstrap Contrastive Domain Adaptation (BCDA), that innovatively applies contrastive learning to domain adaptation. BCDA utilizes a bootstrap clustering positive sampling strategy to ensure stable, end-to-end domain adaptation, preventing model collapse often seen in asymmetric networks. This method not only aligns domains internally through mean square loss but also enhances semantic inter-domain alignment, effectively eliminating false negatives. Our approach, BCDA, represents the first foray into non-contrastive domain adaptation and could serve as a foundational model for future studies. It shows potential to supersede contrastive domain adaptation methods in eliminating false negatives, evidenced by high-level results on three well-known domain adaptation benchmark datasets.</p>","PeriodicalId":48780,"journal":{"name":"Memetic Computing","volume":null,"pages":null},"PeriodicalIF":3.3000,"publicationDate":"2024-08-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bootstrap contrastive domain adaptation\",\"authors\":\"Yan Jia, Yuqing Cheng, Peng Qiao\",\"doi\":\"10.1007/s12293-024-00422-6\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Self-supervised learning, particularly through contrastive learning, has shown significant promise in vision tasks. Although effective, contrastive learning faces the issue of false negatives, particularly under domain shifts in domain adaptation scenarios. The Bootstrap Your Own Latent approach, with its asymmetric structure and avoidance of unnecessary negative samples, offers a foundation to address this issue, which remains underexplored in domain adaptation. We introduce an asymmetrically structured network, the Bootstrap Contrastive Domain Adaptation (BCDA), that innovatively applies contrastive learning to domain adaptation. BCDA utilizes a bootstrap clustering positive sampling strategy to ensure stable, end-to-end domain adaptation, preventing model collapse often seen in asymmetric networks. This method not only aligns domains internally through mean square loss but also enhances semantic inter-domain alignment, effectively eliminating false negatives. Our approach, BCDA, represents the first foray into non-contrastive domain adaptation and could serve as a foundational model for future studies. 
It shows potential to supersede contrastive domain adaptation methods in eliminating false negatives, evidenced by high-level results on three well-known domain adaptation benchmark datasets.</p>\",\"PeriodicalId\":48780,\"journal\":{\"name\":\"Memetic Computing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.3000,\"publicationDate\":\"2024-08-13\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Memetic Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s12293-024-00422-6\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Memetic Computing","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s12293-024-00422-6","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Self-supervised learning, particularly through contrastive learning, has shown significant promise in vision tasks. Although effective, contrastive learning faces the issue of false negatives, particularly under the domain shifts that arise in domain adaptation scenarios. The Bootstrap Your Own Latent approach, with its asymmetric structure and avoidance of unnecessary negative samples, offers a foundation for addressing this issue, which remains underexplored in domain adaptation. We introduce an asymmetrically structured network, Bootstrap Contrastive Domain Adaptation (BCDA), that innovatively applies contrastive learning to domain adaptation. BCDA uses a bootstrap clustering positive-sampling strategy to ensure stable, end-to-end domain adaptation and to prevent the model collapse often seen in asymmetric networks. The method not only aligns domains internally through a mean-square loss but also enhances semantic inter-domain alignment, effectively eliminating false negatives. BCDA represents the first foray into non-contrastive domain adaptation and could serve as a foundational model for future studies. It shows potential to supersede contrastive domain adaptation methods in eliminating false negatives, as evidenced by strong results on three well-known domain adaptation benchmark datasets.
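The abstract's key mechanism is the asymmetric, negative-free objective inherited from Bootstrap Your Own Latent: an online branch with a predictor is regressed onto a slowly updated ("bootstrapped") target branch through a mean-square loss on normalised features, so no negative pairs, and hence no false negatives, enter the objective. The sketch below illustrates only this generic mechanism; it is not the authors' implementation, and all function names, tensor shapes, and the momentum value tau are assumptions for illustration.

```python
# Minimal sketch of a BYOL-style asymmetric update (illustrative; not the BCDA code).
import torch
import torch.nn.functional as F


def bootstrap_mse_loss(online_pred: torch.Tensor, target_proj: torch.Tensor) -> torch.Tensor:
    """Mean-square error between L2-normalised online predictions and stop-gradient
    target projections; equivalent to 2 - 2 * cosine similarity, with no negatives."""
    p = F.normalize(online_pred, dim=-1)
    z = F.normalize(target_proj.detach(), dim=-1)  # stop-gradient on the target branch
    return (2.0 - 2.0 * (p * z).sum(dim=-1)).mean()


@torch.no_grad()
def ema_update(target_net: torch.nn.Module, online_net: torch.nn.Module, tau: float = 0.996) -> None:
    """Bootstrap the target network as an exponential moving average of the online one."""
    for t_param, o_param in zip(target_net.parameters(), online_net.parameters()):
        t_param.data.mul_(tau).add_(o_param.data, alpha=1.0 - tau)
```

The stop-gradient on the target branch and the slow EMA update are what keep such an asymmetric network from collapsing to a trivial constant representation, which is the failure mode the abstract says BCDA's bootstrap clustering positive-sampling strategy is designed to prevent when the two branches see samples from different domains.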

Source journal
Memetic Computing
Categories: Computer Science, Artificial Intelligence; Operations Research & Management Science
CiteScore: 6.80
Self-citation rate: 12.80%
Articles published: 31
Aims and scope: Memes have been defined as basic units of transferrable information that reside in the brain and are propagated across populations through the process of imitation. From an algorithmic point of view, memes have come to be regarded as building-blocks of prior knowledge, expressed in arbitrary computational representations (e.g., local search heuristics, fuzzy rules, neural models, etc.), that have been acquired through experience by a human or machine, and can be imitated (i.e., reused) across problems.

The Memetic Computing journal welcomes papers incorporating the aforementioned socio-cultural notion of memes into artificial systems, with particular emphasis on enhancing the efficacy of computational and artificial intelligence techniques for search, optimization, and machine learning through explicit incorporation of prior knowledge. The goal of the journal is thus to be an outlet for high-quality theoretical and applied research on hybrid, knowledge-driven computational approaches that may be characterized under any of the following categories of memetics:

Type 1: General-purpose algorithms integrated with human-crafted heuristics that capture some form of prior domain knowledge; e.g., traditional memetic algorithms hybridizing evolutionary global search with a problem-specific local search.

Type 2: Algorithms with the ability to automatically select, adapt, and reuse the most appropriate heuristics from a diverse pool of available choices; e.g., learning a mapping between global search operators and multiple local search schemes, given an optimization problem at hand.

Type 3: Algorithms that autonomously learn with experience, adaptively reusing data and/or machine learning models drawn from related problems as prior knowledge in new target tasks of interest; examples include, but are not limited to, transfer learning and optimization, multi-task learning and optimization, or any other multi-X evolutionary learning and optimization methodologies.
Latest articles in this journal
ResGAT: Residual Graph Attention Networks for molecular property prediction
Enhancing online yard crane scheduling through a two-stage rollout memetic genetic programming
Proximal evolutionary strategy: improving deep reinforcement learning through evolutionary policy optimization
Where does the crude oil originate? The role of near-infrared spectroscopy in accurate source detection
Bootstrap contrastive domain adaptation