{"title":"全二次方建模及其在随机子空间无导数方法中的应用","authors":"Yiwen Chen, Warren Hare, Amy Wiebe","doi":"10.1007/s10589-024-00590-8","DOIUrl":null,"url":null,"abstract":"<p>Model-based derivative-free optimization (DFO) methods are an important class of DFO methods that are known to struggle with solving high-dimensional optimization problems. Recent research has shown that incorporating random subspaces into model-based DFO methods has the potential to improve their performance on high-dimensional problems. However, most of the current theoretical and practical results are based on linear approximation models due to the complexity of quadratic approximation models. This paper proposes a random subspace trust-region algorithm based on quadratic approximations. Unlike most of its precursors, this algorithm does not require any special form of objective function. We study the geometry of sample sets, the error bounds for approximations, and the quality of subspaces. In particular, we provide a technique to construct <i>Q</i>-fully quadratic models, which is easy to analyze and implement. We present an almost-sure global convergence result of our algorithm and give an upper bound on the expected number of iterations to find a sufficiently small gradient. We also develop numerical experiments to compare the performance of our algorithm using both linear and quadratic approximation models. The numerical results demonstrate the strengths and weaknesses of using quadratic approximations.</p>","PeriodicalId":55227,"journal":{"name":"Computational Optimization and Applications","volume":"17 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Q-fully quadratic modeling and its application in a random subspace derivative-free method\",\"authors\":\"Yiwen Chen, Warren Hare, Amy Wiebe\",\"doi\":\"10.1007/s10589-024-00590-8\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Model-based derivative-free optimization (DFO) methods are an important class of DFO methods that are known to struggle with solving high-dimensional optimization problems. Recent research has shown that incorporating random subspaces into model-based DFO methods has the potential to improve their performance on high-dimensional problems. However, most of the current theoretical and practical results are based on linear approximation models due to the complexity of quadratic approximation models. This paper proposes a random subspace trust-region algorithm based on quadratic approximations. Unlike most of its precursors, this algorithm does not require any special form of objective function. We study the geometry of sample sets, the error bounds for approximations, and the quality of subspaces. In particular, we provide a technique to construct <i>Q</i>-fully quadratic models, which is easy to analyze and implement. We present an almost-sure global convergence result of our algorithm and give an upper bound on the expected number of iterations to find a sufficiently small gradient. We also develop numerical experiments to compare the performance of our algorithm using both linear and quadratic approximation models. 
The numerical results demonstrate the strengths and weaknesses of using quadratic approximations.</p>\",\"PeriodicalId\":55227,\"journal\":{\"name\":\"Computational Optimization and Applications\",\"volume\":\"17 1\",\"pages\":\"\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Optimization and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10589-024-00590-8\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Optimization and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10589-024-00590-8","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Q-fully quadratic modeling and its application in a random subspace derivative-free method
Model-based derivative-free optimization (DFO) methods are an important class of DFO methods, but they are known to struggle with high-dimensional optimization problems. Recent research has shown that incorporating random subspaces into model-based DFO methods has the potential to improve their performance on high-dimensional problems. However, because quadratic approximation models are more complex to analyze, most current theoretical and practical results are based on linear approximation models. This paper proposes a random subspace trust-region algorithm based on quadratic approximations. Unlike most of its precursors, this algorithm does not require the objective function to have any special form. We study the geometry of sample sets, the error bounds of the approximations, and the quality of the subspaces. In particular, we provide a technique for constructing Q-fully quadratic models that is easy to analyze and implement. We present an almost-sure global convergence result for our algorithm and give an upper bound on the expected number of iterations needed to find a sufficiently small gradient. We also conduct numerical experiments comparing the performance of our algorithm with linear and quadratic approximation models. The numerical results demonstrate the strengths and weaknesses of using quadratic approximations.
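To make the kind of method described in the abstract concrete, the sketch below shows a generic random-subspace trust-region iteration with a regression-based quadratic model. It is only a minimal illustration, not the authors' algorithm: the subspace dimension p, the oversampled uniform sampling rule, the Cauchy-point step, and the radius-update constants are assumptions chosen for readability, and the model construction here is ordinary least-squares regression rather than the paper's Q-fully quadratic technique.

```python
import numpy as np


def quadratic_model(f, x, P, delta, rng):
    """Fit c + g^T s + 0.5 s^T H s to f(x + P s) by regression on sampled points."""
    p = P.shape[1]
    q = 1 + p + p * (p + 1) // 2                      # coefficients of a full quadratic in p variables
    S = rng.uniform(-delta, delta, size=(2 * q, p))   # oversampled subspace points (an arbitrary choice)
    y = np.array([f(x + P @ s) for s in S])

    def features(s):
        quad = [s[i] * s[j] * (0.5 if i == j else 1.0)
                for i in range(p) for j in range(i, p)]
        return np.concatenate(([1.0], s, quad))

    A = np.array([features(s) for s in S])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    c, g = coef[0], coef[1:1 + p]
    H = np.zeros((p, p))
    k = 1 + p
    for i in range(p):
        for j in range(i, p):
            H[i, j] = H[j, i] = coef[k]
            k += 1
    return c, g, H


def subspace_trust_region(f, x0, p=3, delta0=1.0, max_iter=200, eta=0.1, seed=0):
    """Trust-region loop where each model lives in a p-dimensional random subspace."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n, delta, fx = x.size, delta0, f(x)
    for _ in range(max_iter):
        # Draw a random subspace basis by orthonormalizing a Gaussian matrix.
        P, _ = np.linalg.qr(rng.standard_normal((n, p)))
        _, g, H = quadratic_model(f, x, P, delta, rng)
        gn = np.linalg.norm(g)
        if gn < 1e-12:
            delta *= 0.5
            continue
        # Cauchy step of the model inside the trust region (a simple stand-in
        # for an exact trust-region subproblem solver).
        gHg = g @ H @ g
        tau = 1.0 if gHg <= 0 else min(1.0, gn ** 3 / (delta * gHg))
        s = -tau * (delta / gn) * g
        pred = -(g @ s + 0.5 * s @ H @ s)
        trial = x + P @ s
        ftrial = f(trial)
        rho = (fx - ftrial) / max(pred, 1e-16)
        if rho >= eta:
            x, fx = trial, ftrial      # accept the step and enlarge the radius
            delta = min(2.0 * delta, 10.0)
        else:
            delta *= 0.5               # reject the step and shrink the radius
    return x, fx


if __name__ == "__main__":
    # 20-dimensional Rosenbrock function as a toy high-dimensional test problem.
    def rosen(x):
        return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

    x_best, f_best = subspace_trust_region(rosen, np.zeros(20), p=3)
    print(f_best)
```

Because every model is built from a modest number of evaluations in a p-dimensional subspace, the per-iteration cost stays small even when the ambient dimension n is large, which is the motivation for combining random subspaces with model-based DFO.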
Journal introduction:
Computational Optimization and Applications is a peer-reviewed journal committed to the timely publication of research and tutorial papers on the analysis and development of computational algorithms and modeling technology for optimization. Algorithms for general classes of optimization problems or for more specific applied problems are of interest. Both stochastic and deterministic algorithms will be considered. Papers that provide both theoretical analysis and carefully designed computational experiments are particularly welcome.
Topics of interest include, but are not limited to the following:
Large Scale Optimization,
Unconstrained Optimization,
Linear Programming,
Quadratic Programming, Complementarity Problems, and Variational Inequalities,
Constrained Optimization,
Nondifferentiable Optimization,
Integer Programming,
Combinatorial Optimization,
Stochastic Optimization,
Multiobjective Optimization,
Network Optimization,
Complexity Theory,
Approximations and Error Analysis,
Parametric Programming and Sensitivity Analysis,
Parallel Computing, Distributed Computing, and Vector Processing,
Software, Benchmarks, Numerical Experimentation and Comparisons,
Modelling Languages and Systems for Optimization,
Automatic Differentiation,
Applications in Engineering, Finance, Optimal Control, Optimal Design, Operations Research, Transportation, Economics, Communications, Manufacturing, and Management Science.