Multi-strategy Improved Multi-objective Harris Hawk Optimization Algorithm with Elite Opposition-based Learning

Fulin Tian, Jiayang Wang, Fei Chu, Lin Zhou
{"title":"基于精英对立学习的多策略改进多目标Harris Hawk优化算法","authors":"Fulin Tian, Jiayang Wang, Fei Chu, Lin Zhou","doi":"10.1145/3590003.3590030","DOIUrl":null,"url":null,"abstract":"Abstract: To make up for the deficiencies of the Harris hawk optimization algorithm (HHO) in solving multi-objective optimization problems with low algorithm accuracy, slow rate of convergence, and easily fall into the trap of local optima, a multi-strategy improved multi-objective Harris hawk optimization algorithm with elite opposition-based learning (MO-EMHHO) is proposed. First, the population is initialized by Sobol sequences to increase population diversity. Second, incorporate the elite backward learning strategy to improve population diversity and quality. Further, an external profile maintenance method based on an adaptive grid strategy is proposed to make the solution better contracted to the real Pareto frontier. Subsequently, optimize the update strategy of the original algorithm in a non-linear energy update way to improve the exploration and development of the algorithm. Finally, improving the diversity of the algorithm and the uniformity of the solution set using an adaptive variation strategy based on Gaussian random wandering. Experimental comparison of the multi-objective particle swarm algorithm (MOPSO), multi-objective gray wolf algorithm (MOGWO), and multi-objective Harris Hawk algorithm (MOHHO) on the commonly used benchmark functions shows that the MO-EMHHO outperforms the other compared algorithms in terms of optimization seeking accuracy, convergence speed and stability, and provides a new solution to the multi-objective optimization problem.","PeriodicalId":340225,"journal":{"name":"Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-03-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multi-strategy Improved Multi-objective Harris Hawk Optimization Algorithm with Elite Opposition-based Learning\",\"authors\":\"Fulin Tian, Jiayang Wang, Fei Chu, Lin Zhou\",\"doi\":\"10.1145/3590003.3590030\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract: To make up for the deficiencies of the Harris hawk optimization algorithm (HHO) in solving multi-objective optimization problems with low algorithm accuracy, slow rate of convergence, and easily fall into the trap of local optima, a multi-strategy improved multi-objective Harris hawk optimization algorithm with elite opposition-based learning (MO-EMHHO) is proposed. First, the population is initialized by Sobol sequences to increase population diversity. Second, incorporate the elite backward learning strategy to improve population diversity and quality. Further, an external profile maintenance method based on an adaptive grid strategy is proposed to make the solution better contracted to the real Pareto frontier. Subsequently, optimize the update strategy of the original algorithm in a non-linear energy update way to improve the exploration and development of the algorithm. Finally, improving the diversity of the algorithm and the uniformity of the solution set using an adaptive variation strategy based on Gaussian random wandering. 
Experimental comparison of the multi-objective particle swarm algorithm (MOPSO), multi-objective gray wolf algorithm (MOGWO), and multi-objective Harris Hawk algorithm (MOHHO) on the commonly used benchmark functions shows that the MO-EMHHO outperforms the other compared algorithms in terms of optimization seeking accuracy, convergence speed and stability, and provides a new solution to the multi-objective optimization problem.\",\"PeriodicalId\":340225,\"journal\":{\"name\":\"Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning\",\"volume\":\"18 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-03-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3590003.3590030\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 2nd Asia Conference on Algorithms, Computing and Machine Learning","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3590003.3590030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

To compensate for the deficiencies of the Harris hawk optimization (HHO) algorithm when applied to multi-objective optimization problems, namely low accuracy, slow convergence, and a tendency to become trapped in local optima, a multi-strategy improved multi-objective Harris hawk optimization algorithm with elite opposition-based learning (MO-EMHHO) is proposed. First, the population is initialized with Sobol sequences to increase population diversity. Second, an elite opposition-based learning strategy is incorporated to further improve population diversity and quality. Third, an external archive maintenance method based on an adaptive grid strategy is proposed so that the solution set converges more closely to the true Pareto front. Fourth, the update strategy of the original algorithm is revised with a nonlinear energy update to better balance exploration and exploitation. Finally, an adaptive mutation strategy based on Gaussian random walks improves the diversity of the algorithm and the uniformity of the solution set. Experimental comparisons with the multi-objective particle swarm optimization algorithm (MOPSO), the multi-objective grey wolf optimizer (MOGWO), and the multi-objective Harris hawk optimization algorithm (MOHHO) on commonly used benchmark functions show that MO-EMHHO outperforms the compared algorithms in terms of accuracy, convergence speed, and stability, providing a new approach to multi-objective optimization problems.
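The abstract names the building blocks of MO-EMHHO but not their exact formulas. The following Python sketch is an illustration only, assuming common formulations from the literature: scrambled Sobol initialization, the usual elite opposition rule x' = k*(da + db) - x over the elite set's dynamic bounds, a quadratic nonlinear escaping-energy decay in place of HHO's linear schedule, and a shrinking-step Gaussian random-walk mutation. The decay shape, coefficients, and helper names are assumptions, not the paper's definitions.

```python
# Illustrative sketch only: the formulations below are common choices and may
# differ from the paper's exact definitions.
import numpy as np
from scipy.stats import qmc


def sobol_init(n_pop: int, lb: np.ndarray, ub: np.ndarray, seed: int = 0) -> np.ndarray:
    """Initialize a population with a scrambled Sobol sequence, which covers
    the search space more evenly than uniform random sampling."""
    sampler = qmc.Sobol(d=len(lb), scramble=True, seed=seed)
    unit = sampler.random(n_pop)          # low-discrepancy points in [0, 1)^d
    return qmc.scale(unit, lb, ub)        # map to the box [lb, ub]


def elite_opposition(pop: np.ndarray, elite: np.ndarray,
                     lb: np.ndarray, ub: np.ndarray) -> np.ndarray:
    """One common elite opposition-based learning rule: reflect each solution
    about the dynamic bounds spanned by the elite set, x' = k*(da + db) - x."""
    da, db = elite.min(axis=0), elite.max(axis=0)   # dynamic elite bounds
    k = np.random.rand(*pop.shape)                  # per-element random factor
    return np.clip(k * (da + db) - pop, lb, ub)


def nonlinear_energy(e0: float, t: int, t_max: int) -> float:
    """Assumed nonlinear escaping-energy schedule replacing HHO's linear
    E = 2*e0*(1 - t/T); the quadratic decay keeps exploration longer early on."""
    return 2.0 * e0 * (1.0 - (t / t_max) ** 2)


def gaussian_mutation(x: np.ndarray, lb: np.ndarray, ub: np.ndarray,
                      t: int, t_max: int) -> np.ndarray:
    """Adaptive Gaussian random-walk mutation: the step size shrinks over the
    iterations so late iterations refine rather than explore."""
    sigma = 0.1 * (1.0 - t / t_max) * (ub - lb)
    return np.clip(x + sigma * np.random.randn(*x.shape), lb, ub)


if __name__ == "__main__":
    lb, ub = np.zeros(5), np.ones(5)
    pop = sobol_init(32, lb, ub)
    elite = pop[:8]                       # e.g. the current non-dominated subset
    opposed = elite_opposition(pop, elite, lb, ub)
    print(nonlinear_energy(e0=np.random.uniform(-1, 1), t=10, t_max=100))
    print(gaussian_mutation(pop[0], lb, ub, t=10, t_max=100))
```

In a full multi-objective loop these pieces would sit alongside an external archive pruned by the adaptive grid; that archive logic is omitted here because the abstract gives no details of the grid construction.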