{"title":"The appeals of quadratic majorization–minimization","authors":"Marc C. Robini, Lihui Wang, Yuemin Zhu","doi":"10.1007/s10898-023-01361-1","DOIUrl":null,"url":null,"abstract":"<p>Majorization–minimization (MM) is a versatile optimization technique that operates on surrogate functions satisfying tangency and domination conditions. Our focus is on differentiable optimization using inexact MM with quadratic surrogates, which amounts to approximately solving a sequence of symmetric positive definite systems. We begin by investigating the convergence properties of this process, from subconvergence to R-linear convergence, with emphasis on tame objectives. Then we provide a numerically stable implementation based on truncated conjugate gradient. Applications to multidimensional scaling and regularized inversion are discussed and illustrated through numerical experiments on graph layout and X-ray tomography. In the end, quadratic MM not only offers solid guarantees of convergence and stability, but is robust to the choice of its control parameters.</p>","PeriodicalId":15961,"journal":{"name":"Journal of Global Optimization","volume":"171 1","pages":""},"PeriodicalIF":1.8000,"publicationDate":"2024-01-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Global Optimization","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10898-023-01361-1","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Mathematics","Score":null,"Total":0}
Abstract
Majorization–minimization (MM) is a versatile optimization technique that operates on surrogate functions satisfying tangency and domination conditions. Our focus is on differentiable optimization using inexact MM with quadratic surrogates, which amounts to approximately solving a sequence of symmetric positive definite systems. We begin by investigating the convergence properties of this process, from subconvergence to R-linear convergence, with emphasis on tame objectives. Then we provide a numerically stable implementation based on truncated conjugate gradient. Applications to multidimensional scaling and regularized inversion are discussed and illustrated through numerical experiments on graph layout and X-ray tomography. In the end, quadratic MM not only offers solid guarantees of convergence and stability, but is also robust to the choice of its control parameters.
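The abstract's recipe can be sketched informally: at each outer iteration, the objective f is replaced by a quadratic surrogate whose curvature matrix A_k is symmetric positive definite and dominates f around the current iterate x_k; minimizing the surrogate amounts to solving A_k d = -∇f(x_k), and doing so only approximately with a few conjugate-gradient steps gives the inexact scheme the paper studies. The sketch below is an illustrative reconstruction under these assumptions, not the authors' implementation; the names quadratic_mm, truncated_cg, majorizer and all control parameters (outer_iters, cg_iters, tol) are hypothetical.

```python
import numpy as np

def truncated_cg(A, b, max_iters=10, tol=1e-10):
    """Plain conjugate gradient on A d = b, stopped after max_iters steps."""
    d = np.zeros_like(b)
    r = b.copy()          # residual b - A d (d starts at zero)
    p = r.copy()          # search direction
    rs = r @ r
    for _ in range(max_iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        d += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

def quadratic_mm(grad, majorizer, x0, outer_iters=50, cg_iters=10, tol=1e-8):
    """Minimal sketch of inexact MM with quadratic surrogates.

    majorizer(x) is assumed to return an SPD matrix A such that the quadratic
    q(y; x) = f(x) + grad(x)^T (y - x) + 0.5 (y - x)^T A (y - x)
    majorizes f (domination) and touches it at x (tangency).
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(outer_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        A = majorizer(x)                          # SPD curvature bound at x
        d = truncated_cg(A, -g, max_iters=cg_iters)
        x = x + d                                 # approximate surrogate minimizer
    return x
```

Truncating the inner CG loop is what makes the scheme inexact: each surrogate is only approximately minimized, which is precisely the setting whose convergence behavior, from subconvergence to R-linear convergence, the paper analyzes.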
Journal description:
The Journal of Global Optimization publishes carefully refereed papers that encompass theoretical, computational, and applied aspects of global optimization. While the focus is on original research contributions dealing with the search for global optima of non-convex, multi-extremal problems, the journal’s scope covers optimization in the widest sense, including nonlinear, mixed integer, combinatorial, stochastic, robust, multi-objective optimization, computational geometry, and equilibrium problems. Relevant works on data-driven methods and optimization-based data mining are of special interest.
In addition to papers covering theory and algorithms of global optimization, the journal publishes significant papers on numerical experiments, new testbeds, and applications in engineering, management, and the sciences. Applications of particular interest include healthcare, computational biochemistry, energy systems, telecommunications, and finance. Apart from full-length articles, the journal features short communications on both open and solved global optimization problems. It also offers reviews of relevant books and publishes special issues.