Title: Stochastic alternating structure-adapted proximal gradient descent method with variance reduction for nonconvex nonsmooth optimization
Authors: Zehui Jia, Wenxing Zhang, Xingju Cai, Deren Han
DOI: 10.1090/mcom/3867 (https://doi.org/10.1090/mcom/3867)
Publication date: 2024-03-08
Abstract
Block-structured optimization has attracted significant attention in a wide range of practical applications. Following the recent work of M. Nikolova and P. Tan [SIAM J. Optim. 29 (2019), pp. 2053–2078] on solving a class of nonconvex nonsmooth optimization problems, we develop a stochastic alternating structure-adapted proximal (s-ASAP) gradient descent method for block-structured optimization problems. By deploying state-of-the-art variance-reduced gradient estimators (rather than the full gradient) from stochastic optimization, the s-ASAP method applies to nonconvex problems whose objective is the sum of a nonsmooth data-fitting term and a finite number of differentiable functions. The sublinear convergence rate of s-ASAP is built upon the proximal point algorithmic framework, whilst its linear convergence rate is achieved under an error bound condition. Furthermore, the convergence of the sequence produced by s-ASAP is established under the Kurdyka–Łojasiewicz property. Preliminary numerical simulations on image processing applications demonstrate the compelling performance of the proposed method.
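To illustrate the variance-reduced proximal gradient idea the abstract builds on (this is a generic prox-SVRG-style sketch, not the authors' s-ASAP scheme), consider the model problem min_x lam*||x||_1 + (1/2n)*||Ax - b||^2, i.e. a nonsmooth term plus a finite sum of differentiable functions. The function names and step-size choices below are illustrative assumptions:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg(A, b, lam, step=0.01, outer=20, inner=None, seed=0):
    """Prox-SVRG sketch for min_x lam*||x||_1 + (1/2n)*||A x - b||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    inner = inner or n
    x = np.zeros(d)
    for _ in range(outer):
        # Full gradient of the smooth finite sum at the snapshot point.
        x_snap = x.copy()
        full_grad = A.T @ (A @ x_snap - b) / n
        for _ in range(inner):
            i = rng.integers(n)
            gi = A[i] * (A[i] @ x - b[i])            # component grad at x
            gi_snap = A[i] * (A[i] @ x_snap - b[i])  # same component at snapshot
            v = gi - gi_snap + full_grad             # variance-reduced estimator
            x = soft_threshold(x - step * v, step * lam)  # proximal step
    return x
```

Unlike plain stochastic proximal gradient, the estimator v is unbiased and its variance shrinks as x approaches the snapshot, which is what enables the faster convergence rates discussed in the abstract.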