Mohammad Soud Alkousa, Alexander Vladimirovich Gasnikov, Egor Leonidovich Gladin, Ilya Alekseevich Kuruzov, Dmitry Arkad'evich Pasechnyuk, Fedor Sergeevich Stonyakin
{"title":"Solving strongly convex-concave composite saddle-point problems with low dimension of one group of variable","authors":"Mohammad Soud Alkousa, Alexander Vladimirovich Gasnikov, Egor Leonidovich Gladin, Ilya Alekseevich Kuruzov, Dmitry Arkad'evich Pasechnyuk, Fedor Sergeevich Stonyakin","doi":"10.4213/sm9700e","DOIUrl":null,"url":null,"abstract":"Algorithmic methods are developed that guarantee efficient complexity estimates for strongly convex-concave saddle-point problems in the case when one group of variables has a high dimension, while another has a rather low dimension (up to 100). These methods are based on reducing problems of this type to the minimization (maximization) problem for a convex (concave) functional with respect to one of the variables such that an approximate value of the gradient at an arbitrary point can be obtained with the required accuracy using an auxiliary optimization subproblem with respect to the other variable. It is proposed to use the ellipsoid method and Vaidya's method for low-dimensional problems and accelerated gradient methods with inexact information about the gradient or subgradient for high-dimensional problems. In the case when one group of variables, ranging over a hypercube, has a very low dimension (up to five), another proposed approach to strongly convex-concave saddle-point problems is rather efficient. This approach is based on a new version of a multidimensional analogue of Nesterov's method on a square (the multidimensional dichotomy method) with the possibility to use inexact values of the gradient of the objective functional. Bibliography: 28 titles.","PeriodicalId":49573,"journal":{"name":"Sbornik Mathematics","volume":"4 1","pages":"0"},"PeriodicalIF":0.8000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Sbornik Mathematics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4213/sm9700e","RegionNum":4,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS","Score":null,"Total":0}
Citations: 0
Abstract
Algorithmic methods are developed that guarantee efficient complexity estimates for strongly convex-concave saddle-point problems in the case when one group of variables has a high dimension, while another has a rather low dimension (up to 100). These methods are based on reducing problems of this type to the minimization (maximization) problem for a convex (concave) functional with respect to one of the variables such that an approximate value of the gradient at an arbitrary point can be obtained with the required accuracy using an auxiliary optimization subproblem with respect to the other variable. It is proposed to use the ellipsoid method and Vaidya's method for low-dimensional problems and accelerated gradient methods with inexact information about the gradient or subgradient for high-dimensional problems. In the case when one group of variables, ranging over a hypercube, has a very low dimension (up to five), another proposed approach to strongly convex-concave saddle-point problems is rather efficient. This approach is based on a new version of a multidimensional analogue of Nesterov's method on a square (the multidimensional dichotomy method) with the possibility to use inexact values of the gradient of the objective functional. Bibliography: 28 titles.
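Below is a minimal, illustrative sketch (not the authors' code) of the reduction described in the abstract: the low-dimensional variable y is handled by a cutting-plane (ellipsoid) outer loop, and each outer step queries an inexact gradient obtained by approximately solving a high-dimensional inner subproblem in x. The quadratic saddle function S, all constants, and the plain gradient-descent inner solver are assumptions made purely for the demo; the paper itself also considers Vaidya's method for the outer loop and uses accelerated gradient methods with an inexact oracle for the inner problem.

```python
# Sketch of the two-loop scheme under an assumed quadratic saddle function
#   S(x, y) = 0.5 x^T B x + 0.5 mu_x ||x||^2 + y^T A x + b^T y - 0.5 mu_y ||y||^2,
# strongly convex in the high-dimensional x and strongly concave in the
# low-dimensional y.  phi(y) = min_x S(x, y) is concave and, by Danskin's
# theorem, grad phi(y) = A x*(y) + b - mu_y y, where x*(y) is found only
# approximately by the inner solver.
import numpy as np

rng = np.random.default_rng(0)
n_x, n_y = 200, 5                        # high- and low-dimensional groups of variables
A = rng.standard_normal((n_y, n_x)) / np.sqrt(n_x)
B = rng.standard_normal((n_x, n_x))
B = B @ B.T / n_x                        # PSD curvature in x
b = rng.standard_normal(n_y)
mu_x, mu_y = 1.0, 1.0
L_inner = np.linalg.eigvalsh(B).max() + mu_x

def inner_argmin(y, iters=300):
    """Approximate x*(y) = argmin_x S(x, y) by gradient descent with step 1/L;
    more inner iterations give a more accurate outer gradient."""
    x = np.zeros(n_x)
    for _ in range(iters):
        x -= (B @ x + mu_x * x + A.T @ y) / L_inner
    return x

def phi_and_grad(y):
    """Inexact value and gradient of the concave reduced function phi."""
    x = inner_argmin(y)
    val = 0.5 * x @ (B @ x) + 0.5 * mu_x * x @ x + y @ (A @ x) + b @ y - 0.5 * mu_y * y @ y
    grad = A @ x + b - mu_y * y
    return val, grad

def ellipsoid_max(oracle, n, R, iters=200):
    """Ellipsoid method maximizing a concave function over the ball ||y|| <= R,
    driven by the (possibly inexact) first-order oracle above."""
    c, H = np.zeros(n), R ** 2 * np.eye(n)
    best_y, best_val = c.copy(), -np.inf
    for _ in range(iters):
        if np.linalg.norm(c) <= R:
            val, g = oracle(c)
            if val > best_val:
                best_val, best_y = val, c.copy()
            w = -g                       # objective cut: subgradient of the convex -phi
        else:
            w = c                        # feasibility cut: separate c from the ball
        Hw = H @ w
        gamma = np.sqrt(w @ Hw)
        if gamma < 1e-12:
            break
        c = c - Hw / ((n + 1) * gamma)
        H = n**2 / (n**2 - 1) * (H - 2.0 / (n + 1) * np.outer(Hw, Hw) / gamma**2)
    return best_y, best_val

y_hat, val_hat = ellipsoid_max(phi_and_grad, n_y, R=10.0)
print("approximate maximizer of phi:", y_hat, "value:", val_hat)
```

The design point this sketch tries to convey is that the outer method only needs a low-dimensional cutting-plane scheme fed with gradients of controlled inaccuracy, so the ellipsoid loop above could be swapped for Vaidya's method, and the plain inner gradient descent for an accelerated method, without changing the overall structure.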
Journal description:
The Russian original is rigorously refereed in Russia and the translations are carefully scrutinised and edited by the London Mathematical Society. The journal has always maintained the highest scientific level in a wide area of mathematics with special attention to current developments in:
Mathematical analysis
Ordinary differential equations
Partial differential equations
Mathematical physics
Geometry
Algebra
Functional analysis