HRA: A Multi-Criteria Framework for Ranking Metaheuristic Optimization Algorithms

Evgenia-Maria K. Goula, Dimitris G. Sotiropoulos
{"title":"HRA: A Multi-Criteria Framework for Ranking Metaheuristic Optimization Algorithms","authors":"Evgenia-Maria K. Goula, Dimitris G. Sotiropoulos","doi":"arxiv-2409.11617","DOIUrl":null,"url":null,"abstract":"Metaheuristic algorithms are essential for solving complex optimization\nproblems in different fields. However, the difficulty in comparing and rating\nthese algorithms remains due to the wide range of performance metrics and\nproblem dimensions usually involved. On the other hand, nonparametric\nstatistical methods and post hoc tests are time-consuming, especially when we\nonly need to identify the top performers among many algorithms. The\nHierarchical Rank Aggregation (HRA) algorithm aims to efficiently rank\nmetaheuristic algorithms based on their performance across many criteria and\ndimensions. The HRA employs a hierarchical framework that begins with\ncollecting performance metrics on various benchmark functions and dimensions.\nRank-based normalization is employed for each performance measure to ensure\ncomparability and the robust TOPSIS aggregation is applied to combine these\nrankings at several hierarchical levels, resulting in a comprehensive ranking\nof the algorithms. Our study uses data from the CEC 2017 competition to\ndemonstrate the robustness and efficacy of the HRA framework. It examines 30\nbenchmark functions and evaluates the performance of 13 metaheuristic\nalgorithms across five performance indicators in four distinct dimensions. This\npresentation highlights the potential of the HRA to enhance the interpretation\nof the comparative advantages and disadvantages of various algorithms by\nsimplifying practitioners' choices of the most appropriate algorithm for\ncertain optimization problems.","PeriodicalId":501291,"journal":{"name":"arXiv - CS - Performance","volume":"31 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Performance","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.11617","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Metaheuristic algorithms are essential for solving complex optimization problems in different fields. However, comparing and rating these algorithms remains difficult because of the wide range of performance metrics and problem dimensions usually involved. Moreover, nonparametric statistical methods and post hoc tests are time-consuming, especially when we only need to identify the top performers among many algorithms. The Hierarchical Rank Aggregation (HRA) algorithm aims to efficiently rank metaheuristic algorithms based on their performance across many criteria and dimensions. HRA employs a hierarchical framework that begins with collecting performance metrics on various benchmark functions and dimensions. Rank-based normalization is applied to each performance measure to ensure comparability, and robust TOPSIS aggregation combines these rankings at several hierarchical levels, resulting in a comprehensive ranking of the algorithms. Our study uses data from the CEC 2017 competition to demonstrate the robustness and efficacy of the HRA framework. It examines 30 benchmark functions and evaluates the performance of 13 metaheuristic algorithms across five performance indicators in four distinct dimensions. This work highlights the potential of HRA to enhance the interpretation of the comparative advantages and disadvantages of various algorithms, simplifying practitioners' choice of the most appropriate algorithm for a given optimization problem.
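The abstract names two building blocks, rank-based normalization and TOPSIS aggregation, without spelling out the mechanics. The sketch below is a minimal illustration of those two steps on a single performance table (algorithms as rows, criteria as columns): raw scores are converted to per-criterion ranks and a standard TOPSIS closeness score is computed, treating ranks as cost criteria (lower is better). The function names, uniform weights, and toy data are assumptions for illustration; the paper's "robust TOPSIS" variant and its hierarchical application across indicators and dimensions may differ.

```python
import numpy as np
from scipy.stats import rankdata

def rank_normalize(scores):
    """Convert raw scores (rows: algorithms, cols: criteria) to per-criterion
    ranks. Smaller raw values (e.g. mean error) are assumed better and thus
    receive smaller ranks; ties share average ranks."""
    return np.apply_along_axis(rankdata, 0, scores)

def topsis(ranks, weights=None):
    """Standard TOPSIS on a rank matrix where every criterion is a cost
    (lower rank = better). Returns closeness scores in [0, 1]; higher is better."""
    m, n = ranks.shape
    w = np.full(n, 1.0 / n) if weights is None else np.asarray(weights) / np.sum(weights)

    # Vector-normalize each criterion column, then apply the weights.
    norm = ranks / np.sqrt((ranks ** 2).sum(axis=0))
    v = norm * w

    # Ranks are costs, so the ideal point is the column-wise minimum.
    ideal_best, ideal_worst = v.min(axis=0), v.max(axis=0)
    d_best = np.sqrt(((v - ideal_best) ** 2).sum(axis=1))
    d_worst = np.sqrt(((v - ideal_worst) ** 2).sum(axis=1))
    return d_worst / (d_best + d_worst)

# Toy example: 4 algorithms on 3 criteria (values are illustrative only).
scores = np.array([
    [0.02, 1.3, 0.10],
    [0.05, 0.9, 0.40],
    [0.01, 2.1, 0.05],
    [0.08, 1.0, 0.20],
])
closeness = topsis(rank_normalize(scores))
final_order = np.argsort(-closeness)   # best algorithm first
print(closeness, final_order)
```

In the hierarchical setting described by the abstract, such an aggregation would be repeated level by level (e.g. per indicator, then per dimension) until a single overall ranking of the 13 algorithms is obtained.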