An effective subgradient algorithm via Mifflin’s line search for nonsmooth nonconvex multiobjective optimization

IF 6.0 · Tier 2 (Management Science) · Q1 OPERATIONS RESEARCH & MANAGEMENT SCIENCE · European Journal of Operational Research · Pub Date: 2024-07-22 · DOI: 10.1016/j.ejor.2024.07.019
Morteza Maleknia, Majid Soleimani-damaneh
{"title":"针对非平滑非凸多目标优化的米夫林线搜索子梯度有效算法","authors":"Morteza Maleknia ,&nbsp;Majid Soleimani-damaneh","doi":"10.1016/j.ejor.2024.07.019","DOIUrl":null,"url":null,"abstract":"<div><p>We propose a descent subgradient algorithm for unconstrained nonsmooth nonconvex multiobjective optimization problems. To find a descent direction, we present an iterative process that efficiently approximates the <span><math><mi>ɛ</mi></math></span>-subdifferential of each objective function. To this end, we develop a new variant of Mifflin’s line search in which the subgradients are arbitrary and its finite convergence is proved under a semismooth assumption. To reduce the number of subgradient evaluations, we employ a backtracking line search that identifies the objectives requiring an improvement in the current approximation of the <span><math><mi>ɛ</mi></math></span>-subdifferential. Meanwhile, for the remaining objectives, new subgradients are not computed. Unlike bundle-type methods, the proposed approach can handle nonconvexity without the need for algorithmic adjustments. Moreover, the quadratic subproblems have a simple structure, and hence the method is easy to implement. We analyze the global convergence of the proposed method and prove that any accumulation point of the generated sequence satisfies a necessary Pareto optimality condition. Furthermore, our convergence analysis addresses a theoretical challenge in a recently developed subgradient method. Through numerical experiments, we observe the practical capability of the proposed method and evaluate its efficiency when applied to a diverse range of nonsmooth test problems.</p></div>","PeriodicalId":55161,"journal":{"name":"European Journal of Operational Research","volume":null,"pages":null},"PeriodicalIF":6.0000,"publicationDate":"2024-07-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An effective subgradient algorithm via Mifflin’s line search for nonsmooth nonconvex multiobjective optimization\",\"authors\":\"Morteza Maleknia ,&nbsp;Majid Soleimani-damaneh\",\"doi\":\"10.1016/j.ejor.2024.07.019\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>We propose a descent subgradient algorithm for unconstrained nonsmooth nonconvex multiobjective optimization problems. To find a descent direction, we present an iterative process that efficiently approximates the <span><math><mi>ɛ</mi></math></span>-subdifferential of each objective function. To this end, we develop a new variant of Mifflin’s line search in which the subgradients are arbitrary and its finite convergence is proved under a semismooth assumption. To reduce the number of subgradient evaluations, we employ a backtracking line search that identifies the objectives requiring an improvement in the current approximation of the <span><math><mi>ɛ</mi></math></span>-subdifferential. Meanwhile, for the remaining objectives, new subgradients are not computed. Unlike bundle-type methods, the proposed approach can handle nonconvexity without the need for algorithmic adjustments. Moreover, the quadratic subproblems have a simple structure, and hence the method is easy to implement. We analyze the global convergence of the proposed method and prove that any accumulation point of the generated sequence satisfies a necessary Pareto optimality condition. Furthermore, our convergence analysis addresses a theoretical challenge in a recently developed subgradient method. 
Through numerical experiments, we observe the practical capability of the proposed method and evaluate its efficiency when applied to a diverse range of nonsmooth test problems.</p></div>\",\"PeriodicalId\":55161,\"journal\":{\"name\":\"European Journal of Operational Research\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":6.0000,\"publicationDate\":\"2024-07-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"European Journal of Operational Research\",\"FirstCategoryId\":\"91\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0377221724005605\",\"RegionNum\":2,\"RegionCategory\":\"管理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"OPERATIONS RESEARCH & MANAGEMENT SCIENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"European Journal of Operational Research","FirstCategoryId":"91","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0377221724005605","RegionNum":2,"RegionCategory":"管理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"OPERATIONS RESEARCH & MANAGEMENT SCIENCE","Score":null,"Total":0}
Citations: 0

Abstract

We propose a descent subgradient algorithm for unconstrained nonsmooth nonconvex multiobjective optimization problems. To find a descent direction, we present an iterative process that efficiently approximates the ɛ-subdifferential of each objective function. To this end, we develop a new variant of Mifflin’s line search in which the subgradients are arbitrary and its finite convergence is proved under a semismooth assumption. To reduce the number of subgradient evaluations, we employ a backtracking line search that identifies the objectives requiring an improvement in the current approximation of the ɛ-subdifferential. Meanwhile, for the remaining objectives, new subgradients are not computed. Unlike bundle-type methods, the proposed approach can handle nonconvexity without the need for algorithmic adjustments. Moreover, the quadratic subproblems have a simple structure, and hence the method is easy to implement. We analyze the global convergence of the proposed method and prove that any accumulation point of the generated sequence satisfies a necessary Pareto optimality condition. Furthermore, our convergence analysis addresses a theoretical challenge in a recently developed subgradient method. Through numerical experiments, we observe the practical capability of the proposed method and evaluate its efficiency when applied to a diverse range of nonsmooth test problems.
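
To make the quadratic subproblem mentioned in the abstract more concrete, the sketch below shows the generic form it takes in descent subgradient and bundle-type methods: the minimum-norm element of the convex hull of the subgradients collected for all objectives is computed, and its negative serves as a tentative common descent direction. This is only an illustration of the general template, not the authors' implementation; the function name min_norm_direction, the use of scipy.optimize.minimize (SLSQP), and the stationarity tolerance tol are assumptions made for the example.

```python
# Illustrative sketch only (not the paper's algorithm): compute a tentative
# common descent direction as the negative of the minimum-norm element of
# conv{g_1, ..., g_m}, where the g_i are subgradients collected for all
# objectives. This is the classical quadratic subproblem of descent
# subgradient / bundle-type methods.
import numpy as np
from scipy.optimize import minimize

def min_norm_direction(subgradients, tol=1e-6):
    """subgradients: list of 1-D numpy arrays (one per collected subgradient).
    Returns (d, is_stationary): d = -argmin_{v in conv(subgradients)} ||v||,
    and is_stationary = True when that minimum norm is below tol."""
    G = np.vstack(subgradients)            # m x n matrix of subgradients
    m = G.shape[0]
    Q = G @ G.T                            # Gram matrix of the subgradients

    # Solve min_{lam in simplex} lam^T Q lam  (a small convex QP).
    objective = lambda lam: lam @ Q @ lam
    constraints = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * m
    lam0 = np.full(m, 1.0 / m)
    res = minimize(objective, lam0, bounds=bounds, constraints=constraints)

    v = res.x @ G                          # minimum-norm convex combination
    if np.linalg.norm(v) <= tol:
        return np.zeros_like(v), True      # approximate stationarity certificate
    return -v, False                       # tentative common descent direction
```

In the paper, the subgradient collections feeding this subproblem are iteratively built approximations of the ɛ-subdifferential of each objective, refined through the Mifflin-type line search; the sketch only reflects the simple structure of the quadratic subproblem highlighted in the abstract.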
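The role of the backtracking line search described in the abstract, namely identifying which objectives still need a better approximation of their ɛ-subdifferential, can be pictured with the following hypothetical sketch. The Armijo-style acceptance test, the parameter names, and the returned set of blocking objectives are illustrative assumptions, not the authors' exact rules.

```python
# Illustrative sketch only: an Armijo-style backtracking line search for a
# list of objectives. It returns the accepted step size together with the
# indices of the objectives that violated the decrease test at the last
# rejected trial; in the spirit of the abstract, only such objectives would
# have their epsilon-subdifferential approximation enriched with a new
# subgradient.
import numpy as np

def backtracking(f_list, x, d, sigma=1e-4, beta=0.5, t0=1.0, max_backtracks=50):
    """f_list: list of callables f_i(x) -> float; d: search direction."""
    fx = np.array([f(x) for f in f_list])
    decrease = sigma * np.dot(d, d)        # required decrease per unit step
    t = t0
    blocking = []                          # objectives that last failed the test
    for _ in range(max_backtracks):
        trial = np.array([f(x + t * d) for f in f_list])
        blocking = [i for i, (ft, f0) in enumerate(zip(trial, fx))
                    if ft > f0 - t * decrease]
        if not blocking:                   # every objective decreased enough
            return t, []
        t *= beta                          # shrink the step and try again
    return 0.0, blocking                   # null step: enrich these bundles
```

A call such as t, blocking = backtracking([f1, f2], x, d) would either return an accepted step size with an empty blocking set, or a null step (t == 0.0) together with the indices of the objectives that prevented sufficient decrease; only those objectives would then need a new subgradient before the quadratic subproblem is solved again.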

Source journal
European Journal of Operational Research (Management Science / Operations Research & Management Science)
CiteScore: 11.90
Self-citation rate: 9.40%
Articles published: 786
Review time: 8.2 months
About the journal: The European Journal of Operational Research (EJOR) publishes high-quality, original papers that contribute to the methodology of operational research (OR) and to the practice of decision making.
Latest articles in this journal
Prelim p. 2; First issue - Editorial Board
Editorial Board
An exact method for the two-echelon split-delivery vehicle routing problem for liquefied natural gas delivery with the boil-off phenomenon
The demand for hedging of oil producers: A tale of risk and regret
Data-driven dynamic police patrolling: An efficient Monte Carlo tree search