{"title":"超越矩阵的特征值编程","authors":"Masaru Ito, Bruno F. Lourenço","doi":"10.1007/s10589-024-00591-7","DOIUrl":null,"url":null,"abstract":"<p>In this paper we analyze and solve eigenvalue programs, which consist of the task of minimizing a function subject to constraints on the “eigenvalues” of the decision variable. Here, by making use of the FTvN systems framework introduced by Gowda, we interpret “eigenvalues” in a broad fashion going beyond the usual eigenvalues of matrices. This allows us to shed new light on classical problems such as inverse eigenvalue problems and also leads to new applications. In particular, after analyzing and developing a simple projected gradient algorithm for general eigenvalue programs, we show that eigenvalue programs can be used to express what we call <i>vanishing quadratic constraints</i>. A vanishing quadratic constraint requires that a given system of convex quadratic inequalities be satisfied and at least a certain number of those inequalities must be tight. As a particular case, this includes the problem of finding a point <i>x</i> in the intersection of <i>m</i> ellipsoids in such a way that <i>x</i> is also in the boundary of at least <span>\\(\\ell \\)</span> of the ellipsoids, for some fixed <span>\\(\\ell > 0\\)</span>. At the end, we also present some numerical experiments.</p>","PeriodicalId":55227,"journal":{"name":"Computational Optimization and Applications","volume":"31 1","pages":""},"PeriodicalIF":1.6000,"publicationDate":"2024-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Eigenvalue programming beyond matrices\",\"authors\":\"Masaru Ito, Bruno F. Lourenço\",\"doi\":\"10.1007/s10589-024-00591-7\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>In this paper we analyze and solve eigenvalue programs, which consist of the task of minimizing a function subject to constraints on the “eigenvalues” of the decision variable. Here, by making use of the FTvN systems framework introduced by Gowda, we interpret “eigenvalues” in a broad fashion going beyond the usual eigenvalues of matrices. This allows us to shed new light on classical problems such as inverse eigenvalue problems and also leads to new applications. In particular, after analyzing and developing a simple projected gradient algorithm for general eigenvalue programs, we show that eigenvalue programs can be used to express what we call <i>vanishing quadratic constraints</i>. A vanishing quadratic constraint requires that a given system of convex quadratic inequalities be satisfied and at least a certain number of those inequalities must be tight. As a particular case, this includes the problem of finding a point <i>x</i> in the intersection of <i>m</i> ellipsoids in such a way that <i>x</i> is also in the boundary of at least <span>\\\\(\\\\ell \\\\)</span> of the ellipsoids, for some fixed <span>\\\\(\\\\ell > 0\\\\)</span>. 
At the end, we also present some numerical experiments.</p>\",\"PeriodicalId\":55227,\"journal\":{\"name\":\"Computational Optimization and Applications\",\"volume\":\"31 1\",\"pages\":\"\"},\"PeriodicalIF\":1.6000,\"publicationDate\":\"2024-07-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Optimization and Applications\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s10589-024-00591-7\",\"RegionNum\":2,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Optimization and Applications","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s10589-024-00591-7","RegionNum":2,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Citations: 0
Abstract
In this paper we analyze and solve eigenvalue programs, which consist of the task of minimizing a function subject to constraints on the “eigenvalues” of the decision variable. Here, by making use of the FTvN systems framework introduced by Gowda, we interpret “eigenvalues” in a broad fashion going beyond the usual eigenvalues of matrices. This allows us to shed new light on classical problems such as inverse eigenvalue problems and also leads to new applications. In particular, after analyzing and developing a simple projected gradient algorithm for general eigenvalue programs, we show that eigenvalue programs can be used to express what we call vanishing quadratic constraints. A vanishing quadratic constraint requires that a given system of convex quadratic inequalities be satisfied and at least a certain number of those inequalities must be tight. As a particular case, this includes the problem of finding a point x in the intersection of m ellipsoids in such a way that x is also in the boundary of at least \(\ell \) of the ellipsoids, for some fixed \(\ell > 0\). At the end, we also present some numerical experiments.
Journal introduction:
Computational Optimization and Applications is a peer-reviewed journal committed to the timely publication of research and tutorial papers on the analysis and development of computational algorithms and modeling technology for optimization. Algorithms either for general classes of optimization problems or for more specific applied problems are of interest. Stochastic algorithms as well as deterministic algorithms will be considered. Papers that provide both theoretical analysis and carefully designed computational experiments are particularly welcome.
Topics of interest include, but are not limited to, the following:
Large Scale Optimization,
Unconstrained Optimization,
Linear Programming,
Quadratic Programming, Complementarity Problems, and Variational Inequalities,
Constrained Optimization,
Nondifferentiable Optimization,
Integer Programming,
Combinatorial Optimization,
Stochastic Optimization,
Multiobjective Optimization,
Network Optimization,
Complexity Theory,
Approximations and Error Analysis,
Parametric Programming and Sensitivity Analysis,
Parallel Computing, Distributed Computing, and Vector Processing,
Software, Benchmarks, Numerical Experimentation and Comparisons,
Modelling Languages and Systems for Optimization,
Automatic Differentiation,
Applications in Engineering, Finance, Optimal Control, Optimal Design, Operations Research,
Transportation, Economics, Communications, Manufacturing, and Management Science.