A New Hybrid Optimizer for Global Optimization Based on a Comparative Study Remarks of Classical Gradient Descent Variants

Mouad Touarsi, D. Gretete, Abdelmajid Elouadi
Journal: Statistics, Optimization & Information Computing
DOI: 10.19139/SOIC-2310-5070-1005
Published: 2021-07-10 (Journal Article)
Citations: 0

Abstract

In this paper, we present an empirical comparison of several Gradient Descent variants used to solve global optimization problems over large search domains. The aim is to identify which of them is most suitable for solving an optimization problem regardless of the features of the test function used. Five variants of Gradient Descent were implemented in the R language and tested on a benchmark of five test functions. We demonstrated the dependence between the choice of variant and the obtained performance using the chi-squared (χ²) test on a sample of 120 experiments. The test functions vary in convexity and in the number of local minima, and are classified according to several criteria. We chose a range of values for each algorithm parameter. Results are compared in terms of accuracy and convergence speed. Based on the obtained results, we defined a priority of usage for those variants and contributed a new hybrid optimizer. The new optimizer is tested on a benchmark of well-known test functions, and two real applications are proposed. Apart from the classical gradient descent algorithm, only stochastic versions of those variants are considered in this paper.
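To make the methodology concrete, the following is a minimal illustrative sketch, not the paper's R code: it compares two gradient-descent variants (classical batch descent and heavy-ball momentum, one common variant) on the sphere function, a standard convex benchmark, and then runs a Pearson chi-squared independence test between "variant" and "converged / did not converge" counts, in the spirit of the paper's χ² analysis. The step sizes, iteration budget, and contingency counts are hypothetical values chosen for demonstration (the counts total 120 only to echo the paper's 120 experiments).

```python
# Illustrative sketch only: the paper's implementation is in R, and its five
# variants, five test functions, and experimental counts are not reproduced
# here. All parameter values and counts below are hypothetical.
import numpy as np

def sphere_grad(x):
    # Gradient of the sphere function f(x) = sum(x_i ** 2).
    return 2.0 * x

def classical_gd(x0, lr=0.1, iters=200):
    # Classical (batch) gradient descent.
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        x -= lr * sphere_grad(x)
    return x

def momentum_gd(x0, lr=0.1, beta=0.9, iters=200):
    # Heavy-ball momentum, one common gradient-descent variant.
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)
    for _ in range(iters):
        v = beta * v - lr * sphere_grad(x)
        x = x + v
    return x

print("classical GD:", classical_gd([3.0, -4.0]))
print("momentum GD :", momentum_gd([3.0, -4.0]))

# Pearson chi-squared test of independence between "variant" (rows) and
# "converged within tolerance / did not" (columns); counts are made up.
table = np.array([[20, 4], [18, 6], [22, 2], [15, 9], [21, 3]], dtype=float)
row = table.sum(axis=1, keepdims=True)
col = table.sum(axis=0, keepdims=True)
expected = row @ col / table.sum()
chi2 = ((table - expected) ** 2 / expected).sum()
dof = (table.shape[0] - 1) * (table.shape[1] - 1)
print(f"chi2 = {chi2:.4f}, dof = {dof}")
```

A chi2 statistic exceeding the critical value at the given degrees of freedom would indicate that convergence outcome depends on the chosen variant, which is the kind of dependence the paper reports; with real data one would compare against the χ² distribution with dof = (rows − 1)(cols − 1).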