Infeasibility Detection with Primal-Dual Hybrid Gradient for Large-Scale Linear Programming
David Applegate, Mateo Díaz, Haihao Lu, Miles Lubin
SIAM Journal on Optimization, Volume 34, Issue 1, pp. 459–484, March 2024. DOI: 10.1137/22m1510467 (https://doi.org/10.1137/22m1510467)
Abstract
We study the problem of detecting infeasibility of large-scale linear programming problems using the primal-dual hybrid gradient (PDHG) method of Chambolle and Pock [J. Math. Imaging Vision, 40 (2011), pp. 120–145]. The literature on PDHG has focused chiefly on problems with at least one optimal solution. We show that when the problem is infeasible or unbounded, the iterates diverge at a controlled rate toward a well-defined ray. In turn, the direction of this ray recovers infeasibility certificates. Based on this fact, we propose a simple way to extract approximate infeasibility certificates from the iterates of PDHG. We study three sequences that converge to certificates: the difference of iterates, the normalized iterates, and the normalized average. All of them are easy to compute and suitable for large-scale problems. We show that the normalized iterates and normalized averages achieve a convergence rate of O(1/k). This rate is general and applies to any fixed-point iteration of a nonexpansive operator; thus, it is a result of independent interest that goes well beyond our setting. Finally, we show that, under nondegeneracy assumptions, the iterates of PDHG identify the active set of an auxiliary feasible problem in finite time, which ensures that the difference of iterates exhibits eventual linear convergence. These results provide a theoretical justification for infeasibility detection in the newly developed linear programming solver PDLP.
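To make the quantities in the abstract concrete, here is a minimal NumPy sketch of vanilla PDHG applied to an LP in the standard form min cᵀx subject to Ax = b, x ≥ 0, tracking the three sequences named above (difference of iterates, normalized iterates, normalized average). The function name `pdhg_infeasibility_sketch`, the standard-form restriction, the step-size choice, and the specific normalization of the average are illustrative assumptions; the paper and the PDLP solver treat a more general LP formulation.

```python
import numpy as np

def pdhg_infeasibility_sketch(A, b, c, iters=5000):
    """Vanilla PDHG on  min c^T x  s.t.  Ax = b, x >= 0,
    tracking three sequences that converge to the infimal
    displacement vector v; when the LP is infeasible or
    unbounded, the direction of v yields approximate
    infeasibility certificates."""
    m, n = A.shape
    # Step sizes chosen so that tau * sigma * ||A||_2^2 <= 1,
    # which makes the PDHG fixed-point operator nonexpansive.
    tau = sigma = 0.9 / np.linalg.norm(A, 2)
    x, y = np.zeros(n), np.zeros(m)
    z0 = np.concatenate([x, y])
    z_prev = z0.copy()
    z_sum = np.zeros_like(z0)
    for k in range(1, iters + 1):
        # Primal step: gradient step on c - A^T y, projected onto x >= 0.
        x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)
        # Dual step: ascent on b - Ax at the extrapolated point 2*x_new - x.
        y = y + sigma * (b - A @ (2.0 * x_new - x))
        x = x_new
        z = np.concatenate([x, y])
        z_sum += z
        diff = z - z_prev                        # difference of iterates
        normalized = (z - z0) / k                # normalized iterate
        avg = 2.0 * (z_sum / k - z0) / (k + 1)   # normalized average
        z_prev = z
    return diff, normalized, avg

if __name__ == "__main__":
    # Infeasible instance: x1 + x2 = 1 and x1 + x2 = 2 cannot both hold.
    A = np.array([[1.0, 1.0], [1.0, 1.0]])
    b = np.array([1.0, 2.0])
    c = np.array([1.0, 1.0])
    diff, normalized, avg = pdhg_infeasibility_sketch(A, b, c)
    # The dual components should align with a Farkas certificate y
    # satisfying A^T y <= 0 and b^T y > 0 (here, direction y ~ (-1, 1)).
    print("difference of iterates (dual part):", diff[2:])
    print("normalized iterate   (dual part):", normalized[2:])
    print("normalized average   (dual part):", avg[2:])
```

In this sketch all three sequences are cheap to form from quantities PDHG already computes, which is the point the abstract makes about suitability for large-scale problems; the normalization 2(z̄ᵏ − z⁰)/(k+1) of the running average is one common choice and is an assumption here, not necessarily the paper's exact definition.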
About the journal:
The SIAM Journal on Optimization contains research articles on the theory and practice of optimization. The areas addressed include linear and quadratic programming, convex programming, nonlinear programming, complementarity problems, stochastic optimization, combinatorial optimization, integer programming, and convex, nonsmooth, and variational analysis. Contributions may emphasize optimization theory, algorithms, software, computational practice, applications, or the links between these subjects.