Improving derivative-free optimization algorithms through an adaptive sampling procedure

Emmanouil Karantoumanis, Nikolaos Ploskas
DOI: 10.1016/j.rico.2024.100460
Journal: Results in Control and Optimization, Volume 16, Article 100460 (JCR Q3, Mathematics)
Published: 2024-08-20 (Journal Article)
PDF: https://www.sciencedirect.com/science/article/pii/S2666720724000900/pdfft?md5=a5ebaf507a55902356b0db914dacff66&pid=1-s2.0-S2666720724000900-main.pdf

Abstract

Black-box optimization plays a pivotal role in addressing complex real-world problems where the underlying mathematical model is unknown or expensive to evaluate. In this context, this work presents a method to enhance the performance of derivative-free optimization algorithms by integrating an adaptive sampling process. The proposed methodology aims to overcome the limitations of traditional methods by intelligently guiding the search towards promising regions of the search space. To achieve this, we utilize machine learning models, which effectively substitute first-principles models. Furthermore, we employ an error maximization approach to steer the exploration towards areas where the surrogate model deviates significantly from the true model. Moreover, we introduce a heuristic method, an adaptive sampling procedure, that repeatedly calls a widely used derivative-free optimization algorithm, SNOBFIT, allowing for the creation of new and improved surrogate models. To evaluate the efficiency of the proposed method, we conduct a comparative analysis across a benchmark set of 776 continuous problems. Our findings indicate that our approach successfully solved 93% of the problems. Notably, for larger problems, our method outperformed the standard SNOBFIT algorithm by achieving a 19% increase in problem-solving rate, and when we introduced an additional termination criterion to enhance computational efficiency, the proposed method achieved a 31% time reduction compared to SNOBFIT.
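The adaptive loop the abstract describes (fit a surrogate to expensive evaluations, then sample where it deviates most from the true model) can be illustrated with a minimal sketch. This is not the authors' implementation: the toy 1-D objective `true_model`, the Gaussian-RBF surrogate, the fixed candidate grid, and direct evaluation of the true model when scoring the error are all stand-in assumptions made for illustration; the paper uses machine learning surrogates and repeated SNOBFIT calls.

```python
import numpy as np

def true_model(x):
    # Stand-in for an expensive black-box objective.
    return np.sin(3.0 * x) + 0.5 * x

def fit_rbf_surrogate(X, y, eps=2.0):
    # Gaussian RBF interpolant: solve Phi @ w = y at the sample points.
    Phi = np.exp(-(eps * (X[:, None] - X[None, :])) ** 2)
    w = np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)
    def surrogate(x):
        return np.exp(-(eps * (x[:, None] - X[None, :])) ** 2) @ w
    return surrogate

def adaptive_sampling(n_init=4, n_iter=6, seed=0):
    rng = np.random.default_rng(seed)
    X = np.sort(rng.uniform(-2.0, 2.0, n_init))
    y = true_model(X)
    candidates = np.linspace(-2.0, 2.0, 201)
    for _ in range(n_iter):
        s = fit_rbf_surrogate(X, y)
        # Error maximization: pick the candidate where the surrogate
        # deviates most from the true model. (Cheap here; in practice
        # this deviation must be estimated, not evaluated exactly.)
        err = np.abs(s(candidates) - true_model(candidates))
        x_new = candidates[np.argmax(err)]
        X = np.append(X, x_new)
        y = np.append(y, true_model(x_new))
    s = fit_rbf_surrogate(X, y)
    final_err = np.max(np.abs(s(candidates) - true_model(candidates)))
    return X, final_err
```

Each iteration refits the surrogate with the newly acquired point, so the model improves exactly in the regions where it was worst, which is the intuition behind steering a derivative-free search with adaptively sampled surrogates.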
Citations: 0


Source journal
Results in Control and Optimization (Mathematics - Control and Optimization)
CiteScore: 3.00
Self-citation rate: 0.00%
Articles per year: 51
Review time: 91 days