The Whale Optimization Algorithm for Hyperparameter Optimization in Network-Wide Traffic Speed Prediction

Zhang-Han Zhuang, Ming-Chao Chiang
{"title":"The Whale Optimization Algorithm for Hyperparameter Optimization in Network-Wide Traffic Speed Prediction","authors":"Zhang-Han Zhuang, Ming-Chao Chiang","doi":"10.1145/3440943.3444729","DOIUrl":null,"url":null,"abstract":"Since there are way too many possible combinations of hyperparameters for training a desired deep neural network (DNN) model, finding out a set of suitable values for them is typically a difficult topic for the researchers when they use DNN for solving forecasting problems. In addition to manual tuning and trial-and-error for hyperparameters, how to automatically determine the values of hyperparameters has become a critical problem in recent years. In this study, we present a metaheuristic algorithm based on the whale optimization algorithm (WOA) to select suitable hyperparameters for the DNN because WOA demonstrates brilliant convergence speed in many optimization problems and the local optima avoidance mechanism is devised to prevent the searches from trapping into suboptimal solution easily. To validate the feasibility of the proposed algorithm, we compared it with several state-of-the-art hyperparameter selection algorithms for DNN in solving the network-wide traffic speed prediction problem. The experimental results show that WOA not only behaves much more stable but also outperforms all the other hyperparameter selection algorithms compared in this study in terms of the mean square error, mean average error, and mean average percentage error.","PeriodicalId":310247,"journal":{"name":"Proceedings of the 2020 ACM International Conference on Intelligent Computing and its Emerging Applications","volume":"224 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2020 ACM International Conference on Intelligent Computing and its Emerging Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3440943.3444729","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Because there are far too many possible combinations of hyperparameters for training a deep neural network (DNN), finding a suitable set of values is typically difficult for researchers who use DNNs to solve forecasting problems. Beyond manual tuning and trial and error, how to determine hyperparameter values automatically has become a critical problem in recent years. In this study, we present a metaheuristic algorithm based on the whale optimization algorithm (WOA) to select suitable hyperparameters for the DNN, because WOA demonstrates fast convergence on many optimization problems and its local-optima-avoidance mechanism prevents the search from easily becoming trapped in suboptimal solutions. To validate the feasibility of the proposed algorithm, we compared it with several state-of-the-art hyperparameter selection algorithms for DNNs on the network-wide traffic speed prediction problem. The experimental results show that WOA not only behaves much more stably but also outperforms all the other hyperparameter selection algorithms compared in this study in terms of mean square error, mean absolute error, and mean absolute percentage error.
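
The abstract does not include implementation details, so the following is a minimal, illustrative sketch of how WOA can drive hyperparameter search. The update rules (encircling the best solution, random exploration, and the spiral "bubble-net" move) follow the standard WOA formulation; the `objective` function, the search bounds, and the two hyperparameters used here (log learning rate and hidden-unit count) are assumptions for illustration, not the authors' actual setup, and the toy error surface stands in for training the traffic-speed DNN and returning its validation error.

```python
import numpy as np

def objective(x):
    # Hypothetical placeholder: x = [log10(learning_rate), hidden_units].
    # In the paper's setting this would train the DNN with these
    # hyperparameters and return the validation mean square error.
    log_lr, units = x[0], round(x[1])
    return (log_lr + 3.0) ** 2 + 1e-4 * (units - 128) ** 2  # toy error surface

def woa(obj, lower, upper, n_whales=10, n_iter=50, b=1.0, seed=0):
    """Standard whale optimization algorithm minimizing obj over a box."""
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = len(lower)
    X = rng.uniform(lower, upper, size=(n_whales, dim))   # whale positions
    fitness = np.array([obj(x) for x in X])
    best = X[fitness.argmin()].copy()
    best_fit = fitness.min()

    for t in range(n_iter):
        a = 2.0 - 2.0 * t / n_iter               # decreases linearly 2 -> 0
        for i in range(n_whales):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2.0 * a * r1 - a, 2.0 * r2
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1.0):
                    # Encircling prey: move toward the current best whale.
                    D = np.abs(C * best - X[i])
                    X[i] = best - A * D
                else:
                    # Exploration: move relative to a randomly chosen whale.
                    rand = X[rng.integers(n_whales)]
                    D = np.abs(C * rand - X[i])
                    X[i] = rand - A * D
            else:
                # Spiral (bubble-net) update around the best whale.
                D = np.abs(best - X[i])
                l = rng.uniform(-1.0, 1.0, dim)
                X[i] = D * np.exp(b * l) * np.cos(2.0 * np.pi * l) + best
            X[i] = np.clip(X[i], lower, upper)
            fit = obj(X[i])
            if fit < best_fit:
                best, best_fit = X[i].copy(), fit
    return best, best_fit

if __name__ == "__main__":
    # Search log10(learning_rate) in [-5, -1] and hidden units in [16, 512].
    best, best_fit = woa(objective, lower=[-5.0, 16.0], upper=[-1.0, 512.0])
    print("best hyperparameters:", best, "objective:", best_fit)
```

In a full hyperparameter study, the continuous position vector would be decoded into the DNN's training configuration (learning rate, layer sizes, and similar settings), and the returned validation error would serve as the whale's fitness.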