{"title":"Time-discrete momentum consensus-based optimization algorithm and its application to Lyapunov function approximation","authors":"Seung-Yeal Ha, Gyuyoung Hwang, Sungyoon Kim","doi":"10.1142/s0218202524400104","DOIUrl":null,"url":null,"abstract":"<p>In this paper, we study a discrete momentum consensus-based optimization (Momentum-CBO) algorithm which corresponds to a second-order generalization of the discrete first-order CBO [S.-Y. Ha, S. Jin and D. Kim, Convergence of a first-order consensus-based global optimization algorithm, <i>Math. Models Methods Appl. Sci.</i><b>30</b> (2020) 2417–2444]. The proposed algorithm can be understood as the modification of ADAM-CBO, replacing the normalization term by unity. For the proposed Momentum-CBO, we provide a sufficient framework which guarantees the convergence of algorithm toward a global minimum of the objective function. Moreover, we present several experimental results showing that Momentum-CBO has an improved success rate of finding the global minimum compared to vanilla-CBO and show the stability of Momentum-CBO under different initialization schemes. We also show that Momentum-CBO can be used as the alternative of ADAM-CBO which does not have a proper convergence analysis. Finally, we give an application of Momentum-CBO for Lyapunov function approximation using symbolic regression techniques.</p>","PeriodicalId":18311,"journal":{"name":"Mathematical Models and Methods in Applied Sciences","volume":"6 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Mathematical Models and Methods in Applied Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s0218202524400104","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
In this paper, we study a discrete momentum consensus-based optimization (Momentum-CBO) algorithm which corresponds to a second-order generalization of the discrete first-order CBO [S.-Y. Ha, S. Jin and D. Kim, Convergence of a first-order consensus-based global optimization algorithm, Math. Models Methods Appl. Sci. 30 (2020) 2417–2444]. The proposed algorithm can be understood as a modification of ADAM-CBO in which the normalization term is replaced by unity. For the proposed Momentum-CBO, we provide a sufficient framework which guarantees convergence of the algorithm toward a global minimum of the objective function. Moreover, we present several experimental results showing that Momentum-CBO has an improved success rate of finding the global minimum compared to vanilla CBO, and we demonstrate the stability of Momentum-CBO under different initialization schemes. We also show that Momentum-CBO can be used as an alternative to ADAM-CBO, which does not have a proper convergence analysis. Finally, we give an application of Momentum-CBO to Lyapunov function approximation using symbolic regression techniques.
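To make the algorithmic idea concrete, the following is a minimal, hedged sketch of a momentum-type CBO iteration in the spirit described above: particles are attracted toward a Gibbs-weighted consensus point, and that drift is accumulated in a momentum buffer without the second-moment normalization used by ADAM-CBO. The drift/noise model, parameter names, and default values here are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def momentum_cbo(f, dim, n_particles=50, n_steps=200, dt=0.01,
                 alpha=30.0, lam=1.0, sigma=1.0, beta=0.9, seed=0):
    """Sketch of a momentum consensus-based optimization loop.

    Assumptions (not taken from the paper): the drift is the usual CBO
    attraction toward the Gibbs-weighted consensus point, the momentum
    buffer m is an exponential moving average of that drift (no
    second-moment normalization), and the noise is component-wise.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-3.0, 3.0, size=(n_particles, dim))  # particle positions
    m = np.zeros_like(x)                                  # momentum buffer

    for _ in range(n_steps):
        fx = f(x)                                        # objective values, shape (n_particles,)
        w = np.exp(-alpha * (fx - fx.min()))             # Gibbs weights (shifted for numerical stability)
        x_cons = (w[:, None] * x).sum(axis=0) / w.sum()  # consensus point

        drift = x - x_cons                               # attraction toward consensus
        m = beta * m + (1.0 - beta) * drift              # momentum update, normalization replaced by unity
        noise = sigma * np.sqrt(dt) * drift * rng.standard_normal(x.shape)
        x = x - lam * dt * m + noise                     # position update

    fx = f(x)
    return x[np.argmin(fx)], fx.min()                    # best particle found

# Usage: minimize the 2D Rastrigin function (a standard multimodal benchmark).
if __name__ == "__main__":
    def rastrigin(x):
        return 10 * x.shape[1] + (x**2 - 10 * np.cos(2 * np.pi * x)).sum(axis=1)

    x_best, f_best = momentum_cbo(rastrigin, dim=2)
    print(x_best, f_best)
```

Setting beta = 0 in this sketch recovers a plain first-order CBO step, which is one way to read the claim that Momentum-CBO is a second-order generalization of the discrete first-order scheme.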