Authors: Peng Wang, Detong Zhu
Journal: International Journal of Computer Mathematics
DOI: 10.1080/00207160.2023.2269430
Published: 2023-10-10
A single timescale stochastic quasi-Newton method for stochastic optimization
Abstract: In this paper, we propose a single-timescale stochastic quasi-Newton method for solving stochastic optimization problems. The objective function of the problem is a composition of two smooth functions whose derivatives are not available. The algorithm constructs approximate sequences to estimate the gradient of the composite objective function and the value of the inner function. The matrix correction parameters are given in BFGS update form to avoid the assumption that the Hessian matrix of the objective is positive definite. We show the global convergence of the algorithm. The algorithm achieves complexity O(ϵ⁻¹) to find an ϵ-approximate stationary point, i.e. a point at which the expectation of the squared norm of the gradient is smaller than the given accuracy tolerance ϵ. Numerical results on a nonconvex binary classification problem using support vector machines and a multiclass classification problem using neural networks are reported to show the effectiveness of the algorithm.
Keywords: stochastic optimization; quasi-Newton method; BFGS update technique; machine learning
MSC (2010): 49M37; 65K05; 90C30; 90C56
Disclaimer: As a service to authors and researchers we are providing this version of an accepted manuscript (AM). Copyediting, typesetting, and review of the resulting proofs will be undertaken on this manuscript before final publication of the Version of Record (VoR). During production and pre-press, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal also relate to these versions.
Acknowledgments: The authors thank the National Natural Science Foundation (11371253) and the Hainan Natural Science Foundation (120MS029) for their support.
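The single-timescale scheme described in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' algorithm: the function names (`single_timescale_sqn`, `powell_damped_bfgs`), the use of Powell-style damping to keep the curvature matrix positive definite, and all step-size and averaging choices are assumptions; the paper's actual correction parameters and gradient estimators may differ.

```python
import numpy as np

def powell_damped_bfgs(B, s, y, c=0.2):
    """Powell-damped BFGS update of the Hessian approximation B.

    The damping replaces the secant vector y by a convex combination
    of y and B s, which keeps B positive definite even when the true
    Hessian of a nonconvex objective is not.
    """
    Bs = B @ s
    sBs = float(s @ Bs)
    sy = float(s @ y)
    theta = 1.0 if sy >= c * sBs else (1.0 - c) * sBs / (sBs - sy)
    r = theta * y + (1.0 - theta) * Bs  # damped secant residual
    return B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / float(s @ r)

def single_timescale_sqn(x0, sample_g, sample_jac, grad_f,
                         steps=200, alpha=0.1, beta=0.5, seed=0):
    """Sketch of a single-timescale stochastic quasi-Newton loop for
    F(x) = f(g(x)) when only noisy samples of the inner function g
    and its Jacobian are available.

    A single running average u tracks g(x_k) on the same timescale as
    the iterate; the gradient of F is estimated via the chain rule.
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    u = sample_g(x, rng)        # running estimate of the inner function value
    B = np.eye(len(x))          # positive-definite Hessian approximation
    g_est_prev = None
    for _ in range(steps):
        # single-timescale averaging: update u with one fresh sample of g(x)
        u = (1 - beta) * u + beta * sample_g(x, rng)
        # chain-rule gradient estimate:  J_g(x)^T  grad f(u)
        g_est = sample_jac(x, rng).T @ grad_f(u)
        if g_est_prev is not None:
            s, y = x - x_prev, g_est - g_est_prev
            if float(s @ s) > 1e-12:
                B = powell_damped_bfgs(B, s, y)
        x_prev, g_est_prev = x.copy(), g_est
        # quasi-Newton step with the current curvature estimate
        x = x - alpha * np.linalg.solve(B, g_est)
    return x
```

As a usage example, one could take g(x) = Ax observed with noise and f(u) = ½‖u − b‖², whose composite minimizer solves Ax = b; the loop then behaves like a noisy quasi-Newton method on that quadratic.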
About the journal:
International Journal of Computer Mathematics (IJCM) is a world-leading journal serving the community of researchers in numerical analysis and scientific computing from academia to industry. IJCM publishes original research papers of high scientific value in fields of computational mathematics with profound applications to science and engineering.
IJCM welcomes papers on the analysis and applications of innovative computational strategies as well as those with rigorous explorations of cutting-edge techniques and concerns in computational mathematics. Topics IJCM considers include:
• Numerical solutions of systems of partial differential equations and of multi-dimensional partial differential equations
• Theory and computations of nonlocal modelling and fractional partial differential equations
• Novel multi-scale modelling and computational strategies
• Parallel computations
• Numerical optimization and controls
• Imaging algorithms and vision configurations
• Computational stochastic processes and inverse problems
• Stochastic partial differential equations, Monte Carlo simulations and uncertainty quantification
• Computational finance and applications
• Efficient and robust algorithms, and their applications in modern industries, including but not limited to multi-physics, economics and biomedicine.
Papers discussing only variations or combinations of existing methods without significant new computational properties or analysis are not of interest to IJCM.
Please note that research on the development of computer systems and the theory of computing is not suitable for submission to IJCM; consider International Journal of Computer Mathematics: Computer Systems Theory (IJCM: CST) for such manuscripts instead. Any papers submitted in these fields will be transferred to IJCM: CST, so please ensure you submit your paper to the correct journal to save time in the review and processing of your work.
Papers developed from Conference Proceedings
Please note that papers developed from conference proceedings or previously published work must contain at least 40% new material and significantly extend or improve upon earlier research in order to be considered for IJCM.