{"title":"Iteration and stochastic first-order oracle complexities of stochastic gradient descent using constant and decaying learning rates","authors":"Kento Imaizumi, Hideaki Iiduka","doi":"10.1080/02331934.2024.2367635","DOIUrl":null,"url":null,"abstract":"The performance of stochastic gradient descent (SGD), which is the simplest first-order optimizer for training deep neural networks, depends on not only the learning rate but also the batch size. T...","PeriodicalId":54671,"journal":{"name":"Optimization","volume":"157 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-06-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1080/02331934.2024.2367635","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
The performance of stochastic gradient descent (SGD), the simplest first-order optimizer for training deep neural networks, depends not only on the learning rate but also on the batch size. T...
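To make the two learning-rate schedules named in the title concrete, here is a minimal sketch of mini-batch SGD on a toy least-squares problem, comparing a constant learning rate with a 1/sqrt(t) decay. The objective, hyperparameters, and function names are illustrative assumptions, not the paper's method or experimental setup.

```python
import numpy as np

# Synthetic least-squares problem: minimize f(x) = (1/2n) * ||A x - y||^2.
rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 10))
x_true = rng.normal(size=10)
y = A @ x_true + 0.1 * rng.normal(size=1000)

def sgd(lr_schedule, batch_size=32, steps=2000):
    """Run mini-batch SGD; lr_schedule maps step t -> learning rate."""
    x = np.zeros(10)
    for t in range(1, steps + 1):
        idx = rng.integers(0, len(y), size=batch_size)            # sample a mini-batch
        grad = A[idx].T @ (A[idx] @ x - y[idx]) / batch_size      # stochastic gradient
        x -= lr_schedule(t) * grad
    return x

x_const = sgd(lambda t: 1e-3)               # constant learning rate
x_decay = sgd(lambda t: 1e-2 / np.sqrt(t))  # decaying learning rate

for name, x in [("constant", x_const), ("decaying", x_decay)]:
    loss = np.mean((A @ x - y) ** 2) / 2
    print(f"{name}: final loss = {loss:.4f}")
```

In this sketch, each call to the stochastic gradient counts as one query to the stochastic first-order oracle, so the oracle complexity scales with steps times batch size, which is the trade-off between batch size and iteration count that the abstract alludes to.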
About the Journal
Optimization publishes refereed theoretical and applied papers on the latest developments in fields such as linear, nonlinear, stochastic, parametric, discrete, and dynamic programming, control theory, and game theory.
A special section is devoted to review papers on theory and methods in interesting areas of mathematical programming and optimization techniques. The journal also publishes conference proceedings, book reviews and announcements.
All published research articles in this journal have undergone rigorous peer review, based on initial editor screening and anonymous refereeing by independent expert referees.