A Data Stream Ensemble Assisted Multifactorial Evolutionary Algorithm for Offline Data-Driven Dynamic Optimization.

IF 4.6 | CAS Zone 2 (Computer Science) | JCR Q2, Computer Science, Artificial Intelligence | Evolutionary Computation | Pub Date: 2023-12-01 | DOI: 10.1162/evco_a_00332
Cuie Yang, Jinliang Ding, Yaochu Jin, Tianyou Chai
{"title":"A Data Stream Ensemble Assisted Multifactorial Evolutionary Algorithm for Offline Data-Driven Dynamic Optimization.","authors":"Cuie Yang, Jinliang Ding, Yaochu Jin, Tianyou Chai","doi":"10.1162/evco_a_00332","DOIUrl":null,"url":null,"abstract":"<p><p>Existing work on offline data-driven optimization mainly focuses on problems in static environments, and little attention has been paid to problems in dynamic environments. Offline data-driven optimization in dynamic environments is a challenging problem because the distribution of collected data varies over time, requiring surrogate models and optimal solutions tracking with time. This paper proposes a knowledge-transfer-based data-driven optimization algorithm to address these issues. First, an ensemble learning method is adopted to train surrogate models to leverage the knowledge of data in historical environments as well as adapt to new environments. Specifically, given data in a new environment, a model is constructed with the new data, and the preserved models of historical environments are further trained with the new data. Then, these models are considered to be base learners and combined as an ensemble surrogate model. After that, all base learners and the ensemble surrogate model are simultaneously optimized in a multitask environment for finding optimal solutions for real fitness functions. In this way, the optimization tasks in the previous environments can be used to accelerate the tracking of the optimum in the current environment. Since the ensemble model is the most accurate surrogate, we assign more individuals to the ensemble surrogate than its base learners. Empirical results on six dynamic optimization benchmark problems demonstrate the effectiveness of the proposed algorithm compared with four state-of-the-art offline data-driven optimization algorithms. Code is available at https://github.com/Peacefulyang/DSE_MFS.git.</p>","PeriodicalId":50470,"journal":{"name":"Evolutionary Computation","volume":" ","pages":"433-458"},"PeriodicalIF":4.6000,"publicationDate":"2023-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Evolutionary Computation","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1162/evco_a_00332","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 3

Abstract

Existing work on offline data-driven optimization mainly focuses on problems in static environments, and little attention has been paid to problems in dynamic environments. Offline data-driven optimization in dynamic environments is challenging because the distribution of the collected data varies over time, requiring both the surrogate models and the optimal solutions to be tracked over time. This paper proposes a knowledge-transfer-based data-driven optimization algorithm to address these issues. First, an ensemble learning method is adopted to train surrogate models so as to leverage the knowledge of data from historical environments while adapting to new environments. Specifically, given data in a new environment, a model is constructed from the new data, and the preserved models of the historical environments are further trained with the new data. These models are then treated as base learners and combined into an ensemble surrogate model. After that, all base learners and the ensemble surrogate model are simultaneously optimized in a multitask setting to find optimal solutions for the real fitness functions. In this way, the optimization tasks from previous environments can be used to accelerate tracking of the optimum in the current environment. Since the ensemble model is the most accurate surrogate, more individuals are assigned to the ensemble surrogate than to its base learners. Empirical results on six dynamic optimization benchmark problems demonstrate the effectiveness of the proposed algorithm compared with four state-of-the-art offline data-driven optimization algorithms. Code is available at https://github.com/Peacefulyang/DSE_MFS.git.
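To make the method described in the abstract concrete, the following is a minimal sketch of the data-stream ensemble surrogate and the population allocation it mentions. It is not the authors' implementation (see the linked repository for that): the scikit-learn MLPRegressor base learners, the mean combination of base-learner predictions, and the 50% population share given to the ensemble task are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation) of a data-stream ensemble
# surrogate and the individual allocation described in the abstract.
# Assumptions: MLPRegressor base learners, mean-combination ensemble, and an
# illustrative 50% population share for the ensemble task.
import numpy as np
from sklearn.neural_network import MLPRegressor


class StreamEnsembleSurrogate:
    def __init__(self, max_models=5):
        self.base_learners = []   # surrogates preserved from historical environments
        self.max_models = max_models

    def update_environment(self, X_new, y_new):
        """When data from a new environment arrives: continue training the preserved
        historical models on the new data, fit a fresh model on the new data only,
        and keep all of them as base learners of the ensemble."""
        for model in self.base_learners:
            model.partial_fit(X_new, y_new)   # adapt historical surrogates to the new distribution
        fresh = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500)
        fresh.fit(X_new, y_new)               # surrogate built from the new environment alone
        self.base_learners.append(fresh)
        self.base_learners = self.base_learners[-self.max_models:]  # bound memory of old environments

    def predict_ensemble(self, X):
        """Ensemble surrogate: here, the mean of all base-learner predictions (an assumption)."""
        preds = np.column_stack([m.predict(X) for m in self.base_learners])
        return preds.mean(axis=1)


def allocate_individuals(pop_size, n_base_learners, ensemble_share=0.5):
    """Give the (most accurate) ensemble task a larger share of the population than
    each base-learner task; the 0.5 share is illustrative, not taken from the paper."""
    n_ensemble = int(pop_size * ensemble_share)
    n_per_base = (pop_size - n_ensemble) // max(n_base_learners, 1)
    return n_ensemble, n_per_base
```

In this sketch, the ensemble and every base learner would each define one task in the multifactorial evolutionary search, with the returned individual counts determining how much of the population is devoted to each task.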

Source journal
Evolutionary Computation (Engineering & Technology - Computer Science: Theory & Methods)
CiteScore: 6.40
Self-citation rate: 1.50%
Articles per year: 20
Review time: 3 months
About the journal: Evolutionary Computation is a leading journal in its field. It provides an international forum for facilitating and enhancing the exchange of information among researchers involved in both the theoretical and practical aspects of computational systems drawing their inspiration from nature, with particular emphasis on evolutionary models of computation such as genetic algorithms, evolutionary strategies, classifier systems, evolutionary programming, and genetic programming. It welcomes articles from related fields such as swarm intelligence (e.g., Ant Colony Optimization and Particle Swarm Optimization) and other nature-inspired computation paradigms (e.g., Artificial Immune Systems). As well as publishing articles describing theoretical and/or experimental work, the journal also welcomes application-focused papers describing breakthrough results in an application domain, or methodological papers where the specificities of the real-world problem led to significant algorithmic improvements that could possibly be generalized to other areas.