Trust-region algorithms: Probabilistic complexity and intrinsic noise with applications to subsampling techniques
S. Bellavia, G. Gurioli, B. Morini, Ph.L. Toint
EURO Journal on Computational Optimization, Volume 10 (2022), Article 100043. DOI: 10.1016/j.ejco.2022.100043
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S2192440622000193/pdfft?md5=746d8300ed25b919398d91159dcb575f&pid=1-s2.0-S2192440622000193-main.pdf
A trust-region algorithm is presented for finding approximate minimizers of smooth unconstrained functions whose values and derivatives are subject to random noise. It is shown that, under suitable probabilistic assumptions, the new method finds (in expectation) an ϵ-approximate minimizer of arbitrary order q ≥ 1 in at most O(ϵ^(−(q+1))) inexact evaluations of the function and its derivatives, providing the first such result for general optimality orders. The impact of intrinsic noise limiting the validity of the assumptions is also discussed, and it is shown that difficulties are unlikely to occur in the first-order version of the algorithm for sufficiently large gradients. Conversely, should these assumptions fail for specific realizations, "degraded" optimality guarantees are shown to hold when failure occurs. These conclusions are then discussed and illustrated in the context of subsampling methods for finite-sum optimization.
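To make the subsampling setting concrete, the sketch below shows a generic first-order trust-region loop in which the objective and gradient of a finite-sum problem f(x) = (1/N) Σ_i f_i(x) are estimated from a random subsample of the terms. This is an illustrative sketch only, not the authors' algorithm: the least-squares test problem, the sample size, and all parameter values are assumptions chosen for illustration.

```python
# Minimal sketch (assumed setup, not the paper's method): a first-order
# trust-region iteration driven by subsampled estimates of a finite sum.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite-sum test problem: least squares over N data points.
N, d = 1000, 5
A = rng.standard_normal((N, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(N)

def f_sub(x, idx):
    """Average loss over the sampled indices idx (noisy estimate of f)."""
    r = A[idx] @ x - b[idx]
    return 0.5 * np.mean(r ** 2)

def g_sub(x, idx):
    """Average gradient over the sampled indices idx (noisy estimate of grad f)."""
    r = A[idx] @ x - b[idx]
    return A[idx].T @ r / len(idx)

def subsampled_trust_region(x0, max_iter=100, sample=200,
                            delta=1.0, eta=0.1, gamma=2.0):
    """First-order trust-region loop using subsampled function/gradient values."""
    x, radius = x0.copy(), delta
    for _ in range(max_iter):
        idx = rng.choice(N, size=sample, replace=False)  # noisy oracle
        g = g_sub(x, idx)
        if np.linalg.norm(g) < 1e-6:
            break
        # Cauchy-type step: steepest descent to the trust-region boundary.
        s = -radius * g / np.linalg.norm(g)
        pred = -g @ s                                # predicted decrease (linear model)
        ared = f_sub(x, idx) - f_sub(x + s, idx)     # actual decrease on the sample
        rho = ared / pred
        if rho >= eta:                               # successful step: accept, enlarge radius
            x, radius = x + s, gamma * radius
        else:                                        # unsuccessful step: shrink radius
            radius = radius / gamma
    return x

x_hat = subsampled_trust_region(np.zeros(d))
print("full gradient norm at the returned point:",
      np.linalg.norm(g_sub(x_hat, np.arange(N))))
```

The ratio rho of actual to predicted decrease, computed here from sampled values, is what the probabilistic assumptions in the paper concern: when the sampled estimates are accurate enough with sufficiently high probability, the usual trust-region acceptance/rejection mechanism retains its complexity guarantees in expectation.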
Journal overview:
The aim of this journal is to contribute to the many areas in which Operations Research and Computer Science are tightly connected. More precisely, the common element in all contributions to this journal is the use of computers for the solution of optimization problems. Both methodological contributions and innovative applications are considered, but validation through convincing computational experiments is desirable. The journal publishes three types of articles: (i) research articles, (ii) tutorials, and (iii) surveys. A research article presents original methodological contributions. A tutorial provides an introduction to an advanced topic, designed to ease the use of the relevant methodology. A survey provides a wide overview of a given subject by summarizing and organizing research results.