{"title":"基于 HP 增强拉格朗日函数的分布式非凸优化神经动力优化方法。","authors":"","doi":"10.1016/j.neunet.2024.106791","DOIUrl":null,"url":null,"abstract":"<div><div>This paper develops a neurodynamic model for distributed nonconvex-constrained optimization. In the distributed constrained optimization model, the objective function and inequality constraints do not need to be convex, and equality constraints do not need to be affine. A Hestenes–Powell augmented Lagrangian function for handling the nonconvexity is established, and a neurodynamic system is developed based on this. It is proved that it is stable at a local optimal solution of the optimization model. Two illustrative examples are provided to evaluate the enhanced stability and optimality of the developed neurodynamic systems.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":null,"pages":null},"PeriodicalIF":6.0000,"publicationDate":"2024-10-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A neurodynamic optimization approach to distributed nonconvex optimization based on an HP augmented Lagrangian function\",\"authors\":\"\",\"doi\":\"10.1016/j.neunet.2024.106791\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>This paper develops a neurodynamic model for distributed nonconvex-constrained optimization. In the distributed constrained optimization model, the objective function and inequality constraints do not need to be convex, and equality constraints do not need to be affine. A Hestenes–Powell augmented Lagrangian function for handling the nonconvexity is established, and a neurodynamic system is developed based on this. It is proved that it is stable at a local optimal solution of the optimization model. Two illustrative examples are provided to evaluate the enhanced stability and optimality of the developed neurodynamic systems.</div></div>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":6.0000,\"publicationDate\":\"2024-10-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0893608024007159\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608024007159","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
A neurodynamic optimization approach to distributed nonconvex optimization based on an HP augmented Lagrangian function
This paper develops a neurodynamic model for distributed nonconvex constrained optimization. In the distributed constrained optimization model, the objective function and inequality constraints need not be convex, and the equality constraints need not be affine. A Hestenes–Powell (HP) augmented Lagrangian function is established to handle the nonconvexity, and a neurodynamic system is developed based on it. The neurodynamic system is proved to be stable at a local optimal solution of the optimization model. Two illustrative examples are provided to evaluate the stability and optimality of the developed neurodynamic system.
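As background, a common form of the Hestenes–Powell (HP) augmented Lagrangian for a constrained problem min f(x) subject to h(x) = 0 and g(x) ≤ 0, together with a generic primal–dual gradient-flow neurodynamic system, is sketched below. This is the standard textbook construction with a penalty parameter c > 0, not necessarily the distributed formulation analyzed in the paper.

$$
L_c(x,\lambda,\mu) = f(x) + \lambda^{\top} h(x) + \frac{c}{2}\,\|h(x)\|^2
+ \frac{1}{2c}\sum_{i}\Big(\max\{0,\ \mu_i + c\, g_i(x)\}^2 - \mu_i^2\Big)
$$

$$
\epsilon\,\dot{x} = -\nabla_x L_c(x,\lambda,\mu), \qquad
\dot{\lambda} = h(x), \qquad
\dot{\mu}_i = \max\{\,g_i(x),\ -\mu_i/c\,\}
$$

In a flow of this type, the primal state x descends on the augmented Lagrangian while the multipliers λ and μ ascend, so that, under standard assumptions, an equilibrium satisfies the KKT conditions of the original problem; the stability result stated in the abstract concerns equilibria of the paper's distributed variant of such a system.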
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.