{"title":"基于随机后裔的蜂群优化","authors":"Eitan Tadmor, Anil Zenginoğlu","doi":"10.1007/s10440-024-00639-0","DOIUrl":null,"url":null,"abstract":"<div><p>We extend our study of the swarm-based gradient descent method for non-convex optimization, (Lu et al., Swarm-based gradient descent method for non-convex optimization, 2022, arXiv:2211.17157), to allow random descent directions. We recall that the swarm-based approach consists of a swarm of agents, each identified with a position, <span>\\(\\mathbf{x}\\)</span>, and mass, <span>\\(m\\)</span>. The key is the transfer of mass from high ground to low(-est) ground. The mass of an agent dictates its step size: lighter agents take larger steps. In this paper, the essential new feature is the choice of direction: rather than restricting the swarm to march in the steepest gradient descent, we let agents proceed in randomly chosen directions centered around — but otherwise different from — the gradient direction. The random search secures the descent property while at the same time, enabling greater exploration of ambient space. Convergence analysis and benchmark optimizations demonstrate the effectiveness of the swarm-based random descent method as a multi-dimensional global optimizer.</p></div>","PeriodicalId":53132,"journal":{"name":"Acta Applicandae Mathematicae","volume":"190 1","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2024-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Swarm-Based Optimization with Random Descent\",\"authors\":\"Eitan Tadmor, Anil Zenginoğlu\",\"doi\":\"10.1007/s10440-024-00639-0\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>We extend our study of the swarm-based gradient descent method for non-convex optimization, (Lu et al., Swarm-based gradient descent method for non-convex optimization, 2022, arXiv:2211.17157), to allow random descent directions. 
We recall that the swarm-based approach consists of a swarm of agents, each identified with a position, <span>\\\\(\\\\mathbf{x}\\\\)</span>, and mass, <span>\\\\(m\\\\)</span>. The key is the transfer of mass from high ground to low(-est) ground. The mass of an agent dictates its step size: lighter agents take larger steps. In this paper, the essential new feature is the choice of direction: rather than restricting the swarm to march in the steepest gradient descent, we let agents proceed in randomly chosen directions centered around — but otherwise different from — the gradient direction. The random search secures the descent property while at the same time, enabling greater exploration of ambient space. Convergence analysis and benchmark optimizations demonstrate the effectiveness of the swarm-based random descent method as a multi-dimensional global optimizer.</p></div>\",\"PeriodicalId\":53132,\"journal\":{\"name\":\"Acta Applicandae Mathematicae\",\"volume\":\"190 1\",\"pages\":\"\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2024-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Acta Applicandae Mathematicae\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://link.springer.com/article/10.1007/s10440-024-00639-0\",\"RegionNum\":4,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Acta Applicandae 
Mathematicae","FirstCategoryId":"100","ListUrlMain":"https://link.springer.com/article/10.1007/s10440-024-00639-0","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
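The scheme sketched in the abstract — agents carrying mass, mass flowing toward the lowest agent, lighter agents taking larger steps, and each step taken in a random direction centered around the negative gradient — can be illustrated with a minimal numerical sketch. This is not the authors' exact protocol (Lu et al. specify precise mass-transfer rates and step-size rules); the objective `f`, the transfer rate, and the cone angle `theta` below are illustrative placeholders, and a backtracking line search stands in for the paper's descent guarantee.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Illustrative non-convex objective with global minimum value 0 at x = 0.
    return np.sum(x**2 - np.cos(3 * x)) + len(x)

def grad(x, h=1e-6):
    # Central-difference gradient (placeholder for an analytic gradient).
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def random_descent_direction(g, theta=0.3):
    # Random unit direction centered around -grad: perturb the normalized
    # gradient within a cone of half-angle theta, so the directional
    # derivative stays negative and the descent property is preserved.
    gn = g / (np.linalg.norm(g) + 1e-12)
    xi = rng.standard_normal(g.shape)
    xi -= (xi @ gn) * gn                      # component orthogonal to grad
    xi /= np.linalg.norm(xi) + 1e-12
    d = -np.cos(theta) * gn + np.sin(theta) * rng.uniform(-1, 1) * xi
    return d / np.linalg.norm(d)

def swarm_random_descent(x0s, steps=200, eta=0.5):
    xs = [np.array(x, dtype=float) for x in x0s]
    ms = np.full(len(xs), 1.0 / len(xs))      # equal initial masses
    for _ in range(steps):
        vals = np.array([f(x) for x in xs])
        best = int(np.argmin(vals))
        # Transfer mass from high ground to the lowest agent (rate is ad hoc).
        for i in range(len(xs)):
            if i != best:
                dm = 0.05 * ms[i]
                ms[i] -= dm
                ms[best] += dm
        for i, x in enumerate(xs):
            # Lighter agents take larger steps; the heaviest (best) agent
            # takes small refinement steps.
            h = 0.1 * eta if i == best else eta * (1.0 - ms[i] / ms.max())
            d = random_descent_direction(grad(x))
            # Backtracking secures monotone descent for each agent.
            while h > 1e-8 and f(x + h * d) >= f(x):
                h *= 0.5
            if f(x + h * d) < f(x):
                xs[i] = x + h * d
    vals = [f(x) for x in xs]
    return xs[int(np.argmin(vals))], min(vals)

x0s = [rng.uniform(-3, 3, 2) for _ in range(8)]
init_best = min(f(x) for x in x0s)
x_best, f_best = swarm_random_descent(x0s)
```

Because each agent only accepts improving steps, the best value found can never exceed the best initial value, while the random cone around the gradient lets agents explore off the steepest-descent path.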
Journal description:
Acta Applicandae Mathematicae is devoted to the art and techniques of applying mathematics and the development of new, applicable mathematical methods.
Covering a large spectrum from modeling to qualitative analysis and computational methods, Acta Applicandae Mathematicae contains papers on different aspects of the relationship between theory and applications, ranging from descriptive papers on actual applications meeting contemporary mathematical standards to proofs of new and deep theorems in applied mathematics.