A survey and experimental study for embedding-aware generative models: Features, models, and any-shot scenarios

IF 3.3 · CAS Tier 2 (Computer Science) · Q2 (Automation & Control Systems) · Journal of Process Control · Pub Date: 2024-09-21 · DOI: 10.1016/j.jprocont.2024.103297
{"title":"嵌入式感知生成模型的调查和实验研究:特征、模型和任意拍摄场景","authors":"","doi":"10.1016/j.jprocont.2024.103297","DOIUrl":null,"url":null,"abstract":"<div><p>In the era of industrial artificial intelligence, grappling with data insufficiency remains a formidable challenge that stands at the forefront of our progress. The embedding-aware generative model emerges as a promising solution, tackling this issue head-on in the realm of zero-shot learning by ingeniously constructing a generator that bridges the gap between semantic and feature spaces. Thanks to the predefined benchmark and protocols, the number of proposed embedding-aware generative models for zero-shot learning is increasing rapidly. We argue that it is time to take a step back and reconsider the embedding-aware generative paradigm. The main work of this paper is two-fold. First, embedding features in benchmark datasets are somehow overlooked, which potentially limits the performance of generative models, while most researchers focus on how to improve them. Therefore, we conduct a systematic evaluation of 10 representative embedding-aware generative models and prove that even simple representation modifications on the embedding features can improve the performance of generative models for zero-shot learning remarkably. So it is time to pay more attention to the current embedding features in benchmark datasets. Second, based on five benchmark datasets, each with six any-shot learning scenarios, we systematically compare the performance of ten typical embedding-aware generative models for the first time, and we give a strong baseline for zero-shot learning and few-shot learning. Meanwhile, a comprehensive generative model repository, namely, generative any-shot learning repository, is provided, which contains the models, features, parameters, and scenarios of embedding-aware generative models for zero-shot learning and few-shot learning. Any results in this paper can be readily reproduced with only one command line based on generative any-shot learning.</p></div>","PeriodicalId":50079,"journal":{"name":"Journal of Process Control","volume":null,"pages":null},"PeriodicalIF":3.3000,"publicationDate":"2024-09-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A survey and experimental study for embedding-aware generative models: Features, models, and any-shot scenarios\",\"authors\":\"\",\"doi\":\"10.1016/j.jprocont.2024.103297\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>In the era of industrial artificial intelligence, grappling with data insufficiency remains a formidable challenge that stands at the forefront of our progress. The embedding-aware generative model emerges as a promising solution, tackling this issue head-on in the realm of zero-shot learning by ingeniously constructing a generator that bridges the gap between semantic and feature spaces. Thanks to the predefined benchmark and protocols, the number of proposed embedding-aware generative models for zero-shot learning is increasing rapidly. We argue that it is time to take a step back and reconsider the embedding-aware generative paradigm. The main work of this paper is two-fold. First, embedding features in benchmark datasets are somehow overlooked, which potentially limits the performance of generative models, while most researchers focus on how to improve them. 
Therefore, we conduct a systematic evaluation of 10 representative embedding-aware generative models and prove that even simple representation modifications on the embedding features can improve the performance of generative models for zero-shot learning remarkably. So it is time to pay more attention to the current embedding features in benchmark datasets. Second, based on five benchmark datasets, each with six any-shot learning scenarios, we systematically compare the performance of ten typical embedding-aware generative models for the first time, and we give a strong baseline for zero-shot learning and few-shot learning. Meanwhile, a comprehensive generative model repository, namely, generative any-shot learning repository, is provided, which contains the models, features, parameters, and scenarios of embedding-aware generative models for zero-shot learning and few-shot learning. Any results in this paper can be readily reproduced with only one command line based on generative any-shot learning.</p></div>\",\"PeriodicalId\":50079,\"journal\":{\"name\":\"Journal of Process Control\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":3.3000,\"publicationDate\":\"2024-09-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Process Control\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0959152424001379\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"AUTOMATION & CONTROL SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Process Control","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0959152424001379","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

In the era of industrial artificial intelligence, grappling with data insufficiency remains a formidable challenge that stands at the forefront of our progress. The embedding-aware generative model emerges as a promising solution, tackling this issue head-on in the realm of zero-shot learning by ingeniously constructing a generator that bridges the gap between semantic and feature spaces. Thanks to the predefined benchmark and protocols, the number of proposed embedding-aware generative models for zero-shot learning is increasing rapidly. We argue that it is time to take a step back and reconsider the embedding-aware generative paradigm. The main work of this paper is two-fold. First, embedding features in benchmark datasets are somehow overlooked, which potentially limits the performance of generative models, while most researchers focus on how to improve them. Therefore, we conduct a systematic evaluation of 10 representative embedding-aware generative models and prove that even simple representation modifications on the embedding features can improve the performance of generative models for zero-shot learning remarkably. So it is time to pay more attention to the current embedding features in benchmark datasets. Second, based on five benchmark datasets, each with six any-shot learning scenarios, we systematically compare the performance of ten typical embedding-aware generative models for the first time, and we give a strong baseline for zero-shot learning and few-shot learning. Meanwhile, a comprehensive generative model repository, namely, generative any-shot learning repository, is provided, which contains the models, features, parameters, and scenarios of embedding-aware generative models for zero-shot learning and few-shot learning. Any results in this paper can be readily reproduced with only one command line based on generative any-shot learning.
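
To make the paradigm described in the abstract concrete, the sketch below illustrates the general idea of an embedding-aware generative model for zero-shot learning: a conditional network maps class-level semantic embeddings (e.g., attribute vectors) plus noise to synthetic visual features, which can then be used to train an ordinary classifier for unseen classes. This is a minimal illustration under assumed dimensions and layer choices; FeatureGenerator, synthesize_unseen_features, and l2_normalize are hypothetical names, not the authors' implementation or the recipe in the generative any-shot learning repository. Likewise, the L2 normalization at the end is only one plausible example of the "simple representation modification" the abstract alludes to, not necessarily the one evaluated in the paper.

```python
# Minimal sketch (not the authors' code) of the embedding-aware generative
# paradigm for zero-shot learning: synthesize visual features for unseen
# classes from their semantic embeddings, then train a standard classifier.
import torch
import torch.nn as nn

class FeatureGenerator(nn.Module):
    """G(z, a) -> synthetic visual feature, conditioned on semantic embedding a."""
    def __init__(self, attr_dim=85, noise_dim=64, feat_dim=2048, hidden=1024):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(attr_dim + noise_dim, hidden),
            nn.LeakyReLU(0.2),
            nn.Linear(hidden, feat_dim),
            nn.ReLU(),  # pooled CNN features are typically non-negative
        )

    def forward(self, noise, attrs):
        # Concatenate noise and class embedding, then map to feature space.
        return self.net(torch.cat([noise, attrs], dim=1))

def synthesize_unseen_features(generator, unseen_attrs, per_class=300, noise_dim=64):
    """Draw synthetic features per unseen class to train a softmax classifier."""
    feats, labels = [], []
    with torch.no_grad():
        for cls_idx, attr in enumerate(unseen_attrs):        # attr: (attr_dim,)
            z = torch.randn(per_class, noise_dim)
            a = attr.unsqueeze(0).expand(per_class, -1)
            feats.append(generator(z, a))
            labels.append(torch.full((per_class,), cls_idx, dtype=torch.long))
    return torch.cat(feats), torch.cat(labels)

def l2_normalize(features, eps=1e-12):
    """One plausible 'simple representation modification': L2-normalize features."""
    return features / (features.norm(dim=1, keepdim=True) + eps)
```

In a typical pipeline of this kind, the generator is first trained (adversarially or as a VAE-style decoder) on seen-class features and their embeddings; synthesize_unseen_features then supplies surrogate training data so that zero-shot classification reduces to ordinary supervised learning over the unseen classes.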

Source journal
Journal of Process Control (Engineering Technology - Engineering: Chemical)
CiteScore: 7.00
Self-citation rate: 11.90%
Articles published: 159
Review time: 74 days
Journal description
This international journal covers the application of control theory, operations research, computer science and engineering principles to the solution of process control problems. In addition to the traditional chemical processing and manufacturing applications, the scope of process control problems involves a wide range of applications that includes energy processes, nano-technology, systems biology, bio-medical engineering, pharmaceutical processing technology, energy storage and conversion, smart grid, and data analytics, among others. Papers on the theory in these areas will also be accepted provided the theoretical contribution is aimed at the application and the development of process control techniques.

Topics covered include:
• Control applications
• Process monitoring
• Plant-wide control
• Process control systems
• Control techniques and algorithms
• Process modelling and simulation
• Design methods

Advanced design methods exclude well established and widely studied traditional design techniques such as PID tuning and its many variants. Applications in fields such as control of automotive engines, machinery and robotics are not deemed suitable unless a clear motivation for the relevance to process control is provided.
Latest articles in this journal
• Closed-loop training of static output feedback neural network controllers for large systems: A distillation case study
• A survey and experimental study for embedding-aware generative models: Features, models, and any-shot scenarios
• Physics-informed neural networks for multi-stage Koopman modeling of microbial fermentation processes
• Image based Modeling and Control for Batch Processes
• Pruned tree-structured temporal convolutional networks for quality variable prediction of industrial process