Generating visual representations for zero-shot learning via adversarial learning and variational autoencoders

IF 2.4 | CAS Region 4 (Computer Science) | JCR Q2 (Computer Science, Theory & Methods) | International Journal of General Systems | Pub Date: 2023-05-01 | DOI: 10.1080/03081079.2023.2199991
M. Gull, Omar Arif
{"title":"Generating visual representations for zero-shot learning via adversarial learning and variational autoencoders","authors":"M. Gull, Omar Arif","doi":"10.1080/03081079.2023.2199991","DOIUrl":null,"url":null,"abstract":"Computer vision tasks rely heavily on a huge amount of training data for classification, but in everyday situations, it is impossible to assemble a large amount of training data. Zero-shot learning (ZSL) is a promising domain for the applications in which we have no labeled data available for novel classes. It aims to recognize those unseen classes, by transferring semantic information from seen to unseen classes. In this paper, we propose a generative approach for generalized ZSL that combines the strength of Conditional Variational Autoencoder (CVAE) and Conditional Generative Adversarial Network (CGAN). The key to our approach is synthesizing visual features by including a Regressor that works on cycle-consistency loss, which will constrain the whole generative process. For experimental purposes, four challenging data sets, i.e. CUB, AWA1, AWA2 and SUN, are used in both conventional and generalized settings. Our proposed approach achieves significantly better results on these standard datasets in both settings.","PeriodicalId":50322,"journal":{"name":"International Journal of General Systems","volume":"52 1","pages":"636 - 651"},"PeriodicalIF":2.4000,"publicationDate":"2023-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of General Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1080/03081079.2023.2199991","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0

Abstract

Computer vision tasks rely heavily on large amounts of training data for classification, but in many everyday situations it is impractical to assemble such data. Zero-shot learning (ZSL) is a promising domain for applications in which no labeled data are available for novel classes. It aims to recognize these unseen classes by transferring semantic information from seen to unseen classes. In this paper, we propose a generative approach for generalized ZSL that combines the strengths of the Conditional Variational Autoencoder (CVAE) and the Conditional Generative Adversarial Network (CGAN). The key to our approach is synthesizing visual features with the help of a Regressor trained under a cycle-consistency loss, which constrains the whole generative process. For experiments, four challenging datasets, i.e. CUB, AWA1, AWA2 and SUN, are used in both conventional and generalized settings. Our proposed approach achieves significantly better results on these standard datasets in both settings.
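The abstract describes, at a high level, a generative pipeline in which a CVAE and a CGAN share a conditional feature generator, and a Regressor maps synthesized features back to their class attributes so that a cycle-consistency loss constrains generation. The paper itself provides no code here, so the following PyTorch sketch is only an illustration of that idea under assumed network sizes, loss weights, and attribute dimensionality; it is not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's code): CVAE + CGAN feature generator
# with a Regressor that enforces a cycle-consistency loss on synthetic features.
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM, ATTR_DIM, Z_DIM = 2048, 85, 64  # assumed: ResNet-style features, AWA-style attributes

class Encoder(nn.Module):
    """q(z | x, a): encodes a visual feature x conditioned on class attributes a."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(FEAT_DIM + ATTR_DIM, 512), nn.ReLU())
        self.mu = nn.Linear(512, Z_DIM)
        self.logvar = nn.Linear(512, Z_DIM)

    def forward(self, x, a):
        h = self.body(torch.cat([x, a], dim=1))
        return self.mu(h), self.logvar(h)

class Generator(nn.Module):
    """p(x | z, a): decodes a latent code plus attributes into a synthetic visual feature."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(Z_DIM + ATTR_DIM, 512), nn.ReLU(),
                                  nn.Linear(512, FEAT_DIM))

    def forward(self, z, a):
        return self.body(torch.cat([z, a], dim=1))

class Discriminator(nn.Module):
    """D(x, a): scores whether a (feature, attribute) pair looks real."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(FEAT_DIM + ATTR_DIM, 512), nn.ReLU(),
                                  nn.Linear(512, 1))

    def forward(self, x, a):
        return self.body(torch.cat([x, a], dim=1))

class Regressor(nn.Module):
    """R(x): maps a visual feature back to its attribute vector (cycle consistency)."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(FEAT_DIM, 512), nn.ReLU(),
                                  nn.Linear(512, ATTR_DIM))

    def forward(self, x):
        return self.body(x)

def generator_loss(x, a, enc, gen, disc, reg, lam_adv=1.0, lam_cyc=1.0):
    """Generator-side loss: VAE reconstruction + KL + adversarial + cycle-consistency.
    The weights lam_adv / lam_cyc are placeholders, not the paper's values."""
    mu, logvar = enc(x, a)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)       # reparameterization trick
    x_hat = gen(z, a)

    recon = F.mse_loss(x_hat, x)                                   # CVAE reconstruction term
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())  # KL(q(z|x,a) || N(0, I))
    d_fake = disc(x_hat, a)
    adv = F.binary_cross_entropy_with_logits(d_fake, torch.ones_like(d_fake))  # fool D
    cyc = F.mse_loss(reg(x_hat), a)  # Regressor pulls synthetic features back to their attributes

    return recon + kl + lam_adv * adv + lam_cyc * cyc
```

In a typical generative ZSL pipeline of this kind, the trained generator is then used to synthesize visual features for unseen classes from their attribute vectors, and an ordinary classifier is trained on the synthetic features (plus real seen-class features in the generalized setting). The abstract does not spell out this step, so treat it as the standard recipe rather than the paper's exact procedure.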
Source Journal

International Journal of General Systems
Category: Engineering & Technology / Computer Science: Theory & Methods
CiteScore: 4.10
Self-citation rate: 20.00%
Articles published: 38
Review time: 6 months
Journal Introduction: International Journal of General Systems is a periodical devoted primarily to the publication of original research contributions to system science, basic as well as applied. However, relevant survey articles, invited book reviews, bibliographies, and letters to the editor are also published. The principal aim of the journal is to promote original systems ideas (concepts, principles, methods, theoretical or experimental results, etc.) that are broadly applicable to various kinds of systems. The term "general system" in the name of the journal is intended to indicate this aim: the orientation to systems ideas that have a general applicability. Typical subject areas covered by the journal include: uncertainty and randomness; fuzziness and imprecision; information; complexity; inductive and deductive reasoning about systems; learning; systems analysis and design; and theoretical as well as experimental knowledge regarding various categories of systems. Submitted research must be well presented and must clearly state the contribution and novelty. Manuscripts dealing with particular kinds of systems which lack general applicability across a broad range of systems should be sent to journals specializing in the respective topics.
Latest Articles in This Journal

Stress-strength reliability estimation of s-out-of-k multicomponent systems based on copula function for dependent strength elements under progressively censored sample
Reliability of a consecutive k-out-of-n: G system with protection blocks
Two-way concept-cognitive learning method: a perspective from progressive learning of fuzzy skills
Disturbance-observer-based adaptive neural event-triggered fault-tolerant control for uncertain nonlinear systems against sensor faults
Idempotent uninorms on bounded lattices with at most a single point incomparable with the neutral element: Part II