LLaMEA: A Large Language Model Evolutionary Algorithm for Automatically Generating Metaheuristics

IEEE Transactions on Evolutionary Computation · Impact Factor 11.7 · JCR Q1 (Computer Science, Artificial Intelligence) · CAS Tier 1 · Published: 2024-11-13 · DOI: 10.1109/TEVC.2024.3497793
Niki van Stein;Thomas Bäck
Journal: IEEE Transactions on Evolutionary Computation, vol. 29, no. 2, pp. 331-345
Article: https://ieeexplore.ieee.org/document/10752628/
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10752628
Citations: 0

Abstract

Large language models (LLMs), such as GPT-4, have demonstrated their ability to understand natural language and generate complex code snippets. This article introduces a novel LLM evolutionary algorithm (LLaMEA) framework, leveraging GPT models for the automated generation and refinement of algorithms. Given a set of criteria and a task definition (the search space), LLaMEA iteratively generates, mutates, and selects algorithms based on performance metrics and feedback from runtime evaluations. This framework offers a unique approach to generating optimized algorithms without requiring extensive prior expertise. We show how this framework can be used to automatically generate novel closed-box metaheuristic optimization algorithms for box-constrained, continuous optimization problems. LLaMEA generates multiple algorithms that outperform state-of-the-art optimization algorithms (covariance matrix adaptation evolution strategy and differential evolution) on the 5-D closed-box optimization benchmark (BBOB). The algorithms also show competitive performance on the 10- and 20-D instances of the test functions, although they have not seen such instances during the automated generation process. The results demonstrate the feasibility of the framework and identify future directions for automated generation and optimization of algorithms via LLMs.
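The generate-mutate-select loop described in the abstract can be sketched in a few lines of Python. The snippet below is a minimal illustration, not the authors' implementation: the real LLaMEA queries a GPT model for complete algorithm code and scores it on the BBOB suite, whereas here `llm_propose` is a stub that emits a toy (1+1) random-search variant so the elitist loop can run end to end. All names (`llm_propose`, `evaluate`, `llamea_loop`) are illustrative assumptions.

```python
import random


def llm_propose(prompt: str) -> str:
    """Stand-in for an LLM call (e.g. GPT-4). In LLaMEA proper the model
    returns a full Python metaheuristic; here we emit a toy (1+1)-style
    random search whose step size varies between proposals."""
    sigma = random.uniform(0.01, 1.0)
    return (
        "def algorithm(f, dim, budget):\n"
        "    import random\n"
        "    x = [1.0] * dim\n"
        "    fx = f(x)\n"
        "    for _ in range(budget - 1):\n"
        f"        y = [xi + random.gauss(0, {sigma}) for xi in x]\n"
        "        fy = f(y)\n"
        "        if fy < fx:\n"
        "            x, fx = y, fy\n"
        "    return fx\n"
    )


def evaluate(code: str, f, dim: int = 5, budget: int = 200) -> float:
    """Execute the generated source and score it on a test function
    (the paper uses the BBOB benchmark; a single function stands in here)."""
    namespace = {}
    exec(code, namespace)
    return namespace["algorithm"](f, dim, budget)


def llamea_loop(f, iterations: int = 10):
    """Elitist generate-evaluate-select loop: keep the better of the
    incumbent and the new proposal, and feed the score back as prompt."""
    random.seed(0)  # deterministic for the sketch
    best_code = llm_propose("Write a metaheuristic for box-constrained optimization.")
    best_score = evaluate(best_code, f)
    for _ in range(iterations):
        feedback = f"Previous algorithm scored {best_score:.4f}; improve it."
        candidate = llm_propose(feedback)
        score = evaluate(candidate, f)
        if score <= best_score:  # minimization: lower is better
            best_code, best_score = candidate, score
    return best_code, best_score


if __name__ == "__main__":
    sphere = lambda x: sum(v * v for v in x)
    code, score = llamea_loop(sphere)
    print(score)
```

Because the selection step is elitist, the returned score can never exceed the first proposal's score; the feedback string mirrors how runtime evaluations are folded back into the LLM prompt in the framework.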
Source journal

IEEE Transactions on Evolutionary Computation (JCR category: Engineering & Technology - Computer Science: Theory & Methods)
CiteScore: 21.90
Self-citation rate: 9.80%
Articles per year: 196
Review time: 3.6 months
Journal description: The IEEE Transactions on Evolutionary Computation is published by the IEEE Computational Intelligence Society on behalf of 13 societies: Circuits and Systems; Computer; Control Systems; Engineering in Medicine and Biology; Industrial Electronics; Industry Applications; Lasers and Electro-Optics; Oceanic Engineering; Power Engineering; Robotics and Automation; Signal Processing; Social Implications of Technology; and Systems, Man, and Cybernetics. The journal publishes original papers in evolutionary computation and related areas such as nature-inspired algorithms, population-based methods, optimization, and hybrid systems. It welcomes both purely theoretical papers and application papers that provide general insights into these areas of computation.
Latest articles from this journal

- Fully Tensorized GPU-Accelerated Multi-Population Evolutionary Algorithm for Constrained Multiobjective Optimization Problems
- A Level-Based Multi-Population Self-Adaptive Constrained Multiobjective Evolutionary Algorithm for Cascade Reservoir Scheduling
- Evolutionary Optimization of Physics-Informed Neural Networks: Advancing Generalizability by the Baldwin Effect
- Multi-Population Co-Evolutionary Generative Adversarial Network Architecture Search for Zero-Shot Learning
- Multi-Tree Genetic Programming With Deep Contextual Bandits for Learning-Assisted Scheduling of Cluster Tools Integrating Stockers