Constant optimization and feature standardization in multiobjective genetic programming

IF 1.7 | CAS Zone 3 (Computer Science) | Q3 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE | Genetic Programming and Evolvable Machines | Pub Date: 2021-08-19 | DOI: 10.1007/s10710-021-09410-y
Rockett, Peter
{"title":"多目标遗传规划中的持续优化与特征标准化","authors":"Rockett, Peter","doi":"10.1007/s10710-021-09410-y","DOIUrl":null,"url":null,"abstract":"<p>This paper extends the numerical tuning of tree constants in genetic programming (GP) to the multiobjective domain. Using ten real-world benchmark regression datasets and employing Bayesian comparison procedures, we first consider the effects of feature standardization (without constant tuning) and conclude that standardization generally produces lower test errors, but, contrary to other recently published work, we find much less clear trend for tree sizes. In addition, we consider the effects of constant tuning – with and without feature standardization – and observe that (1) constant tuning invariably improves test error, and (2) usually decreases tree size. Combined with standardization, constant tuning produces the best test error results; tree sizes, however, are increased. We also examine the effects of applying constant tuning only once at the end a conventional GP run which turns out to be surprisingly promising. Finally, we consider the merits of using numerical procedures to tune tree constants and observe that for around half the datasets evolutionary search alone is superior whereas for the remaining half, parameter tuning is superior. We identify a number of open research questions that arise from this work.</p>","PeriodicalId":50424,"journal":{"name":"Genetic Programming and Evolvable Machines","volume":"30 6","pages":""},"PeriodicalIF":1.7000,"publicationDate":"2021-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Constant optimization and feature standardization in multiobjective genetic programming\",\"authors\":\"Rockett, Peter\",\"doi\":\"10.1007/s10710-021-09410-y\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>This paper extends the numerical tuning of tree constants in genetic programming (GP) to the multiobjective domain. Using ten real-world benchmark regression datasets and employing Bayesian comparison procedures, we first consider the effects of feature standardization (without constant tuning) and conclude that standardization generally produces lower test errors, but, contrary to other recently published work, we find much less clear trend for tree sizes. In addition, we consider the effects of constant tuning – with and without feature standardization – and observe that (1) constant tuning invariably improves test error, and (2) usually decreases tree size. Combined with standardization, constant tuning produces the best test error results; tree sizes, however, are increased. We also examine the effects of applying constant tuning only once at the end a conventional GP run which turns out to be surprisingly promising. Finally, we consider the merits of using numerical procedures to tune tree constants and observe that for around half the datasets evolutionary search alone is superior whereas for the remaining half, parameter tuning is superior. 
We identify a number of open research questions that arise from this work.</p>\",\"PeriodicalId\":50424,\"journal\":{\"name\":\"Genetic Programming and Evolvable Machines\",\"volume\":\"30 6\",\"pages\":\"\"},\"PeriodicalIF\":1.7000,\"publicationDate\":\"2021-08-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Genetic Programming and Evolvable Machines\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s10710-021-09410-y\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Genetic Programming and Evolvable Machines","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s10710-021-09410-y","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 1

Abstract

This paper extends the numerical tuning of tree constants in genetic programming (GP) to the multiobjective domain. Using ten real-world benchmark regression datasets and employing Bayesian comparison procedures, we first consider the effects of feature standardization (without constant tuning) and conclude that standardization generally produces lower test errors, but, contrary to other recently published work, we find a much less clear trend for tree sizes. In addition, we consider the effects of constant tuning, with and without feature standardization, and observe that (1) constant tuning invariably improves test error, and (2) usually decreases tree size. Combined with standardization, constant tuning produces the best test error results; tree sizes, however, are increased. We also examine the effects of applying constant tuning only once, at the end of a conventional GP run, which turns out to be surprisingly promising. Finally, we consider the merits of using numerical procedures to tune tree constants and observe that for around half the datasets evolutionary search alone is superior, whereas for the remaining half parameter tuning is superior. We identify a number of open research questions that arise from this work.
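The page carries no code, but the two treatments the abstract compares are easy to sketch. First, feature standardization: the sketch below applies z-score standardization in Python, fitting the mean and standard deviation on the training split only so no test-set statistics leak into training. The function name and split handling are assumptions for illustration, not details from the paper.

```python
import numpy as np

def standardize(X_train, X_test):
    """Z-score each feature: fit the mean and standard deviation on the
    training split only, then apply the same transform to the test split."""
    mu = X_train.mean(axis=0)
    sigma = X_train.std(axis=0)
    sigma = np.where(sigma == 0.0, 1.0, sigma)  # guard against constant features
    return (X_train - mu) / sigma, (X_test - mu) / sigma
```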
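Second, constant tuning: the abstract does not spell out the numerical procedure, so the following sketch only illustrates the general idea, i.e. holding an evolved tree's structure fixed and minimising training error over its vector of constants. The toy expression, the synthetic data, and the choice of Nelder-Mead via scipy.optimize.minimize are all illustrative assumptions, not the paper's actual method.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in for an evolved GP tree f(x; c) = c0*x0 + sin(c1*x1);
# the tree structure is fixed, only the constants c are tuned numerically.
def tree_eval(consts, X):
    c0, c1 = consts
    return c0 * X[:, 0] + np.sin(c1 * X[:, 1])

def train_mse(consts, X, y):
    return np.mean((tree_eval(consts, X) - y) ** 2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 2.0 * X[:, 0] + np.sin(0.5 * X[:, 1])  # synthetic target

# Refine the constants the evolutionary search produced (here: all ones),
# leaving the evolved tree structure untouched.
result = minimize(train_mse, x0=np.array([1.0, 1.0]), args=(X, y),
                  method="Nelder-Mead")
print(result.x)  # tuned constants, ideally close to [2.0, 0.5]
```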
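Finally, the multiobjective setting means candidates are compared on more than one objective at once, for instance test error and tree size, the two quantities the abstract reports. A minimal Pareto-dominance check, again only an assumed illustration rather than the paper's selection scheme:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all objectives minimised):
    a is no worse on every objective and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# e.g. (test_error, tree_size): lower error and a smaller tree dominates
assert dominates((0.10, 25), (0.12, 40))
assert not dominates((0.10, 40), (0.12, 25))  # a trade-off: neither dominates
```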

Source journal: Genetic Programming and Evolvable Machines (Engineering & Technology - Computer Science: Theory & Methods)
CiteScore: 5.90
Self-citation rate: 3.80%
Publication volume: 19
Review time: 6 months
About the journal: A unique source reporting on methods for artificial evolution of programs and machines. Reports innovative and significant progress in automatic evolution of software and hardware. Features both theoretical and application papers. Covers hardware implementations, artificial life, molecular computing and emergent computation techniques. Examines such related topics as evolutionary algorithms with variable-size genomes, alternate methods of program induction, approaches to engineering systems development based on embryology, morphogenesis or other techniques inspired by adaptive natural systems.
Latest articles in this journal:
- Evolving code with a large language model
- Hga-lstm: LSTM architecture and hyperparameter search by hybrid GA for air pollution prediction
- A survey on dynamic populations in bio-inspired algorithms
- GSGP-hardware: instantaneous symbolic regression with an FPGA implementation of geometric semantic genetic programming
- Geometric semantic GP with linear scaling: Darwinian versus Lamarckian evolution