Tassawar Ali, Hikmat Ullah Khan, Fawaz Khaled Alarfaj, Mohammed AlReshoodi
{"title":"混合深度学习和进化算法,实现准确的云计算工作量预测","authors":"Tassawar Ali, Hikmat Ullah Khan, Fawaz Khaled Alarfaj, Mohammed AlReshoodi","doi":"10.1007/s00607-024-01340-8","DOIUrl":null,"url":null,"abstract":"<p>Cloud computing offers demand-based allocation of required resources to its clients ensuring optimal use of resources in a cost-effective manner. However, due to the massive increase in demand for physical resources by datacenters cloud management suffers from inefficient resource management. To enhance efficiency by reducing resource setup time, workload prediction has become an active research area. It helps to make management decisions proactively and enables the cloud management system to better respond to spikes in the workload. This study proposes a hybrid model combining both state-of-the-art deep learning models and evolutionary algorithms for workload prediction. The proposed cluster-based differential evolution neural network model utilizes differential evolution for the optimization of feature weights of the deep neural network to predict the future workloads of a cloud datacenter. The proposed model uses a novel mutation strategy that clusters the population based on an agglomerative technique and chooses the best gene from randomly chosen clusters. Thus, the strategy creates a balance between the exploration and exploitation of the population and enables the model to avoid local optima and converge rapidly. The datasets used for the experiments are created from Google’s real-world traces and the Alibaba platform. The model is compared with backpropagation, Adam optimizer-based LSTM, and an evolutionary neural network-based three-mutation policy. We evaluated the performance of the proposed model in terms of root mean squared error in predicting the upcoming CPU, RAM, and BW usage. The proposed model achieved an error rate as low as 0.0002 to outperform the existing studies in the relevant literature. 
To further authenticate the results, we performed the statistical analysis of the obtained results in terms of R-squared, mean bias deviation, 90th percentile score, and Theil’s U statistics. The high accuracy and automaticity of the proposed model have paved the way for its application in diverse areas of cloud computing, including real-time applications.</p>","PeriodicalId":10718,"journal":{"name":"Computing","volume":"10 1","pages":""},"PeriodicalIF":3.3000,"publicationDate":"2024-08-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Hybrid deep learning and evolutionary algorithms for accurate cloud workload prediction\",\"authors\":\"Tassawar Ali, Hikmat Ullah Khan, Fawaz Khaled Alarfaj, Mohammed AlReshoodi\",\"doi\":\"10.1007/s00607-024-01340-8\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Cloud computing offers demand-based allocation of required resources to its clients ensuring optimal use of resources in a cost-effective manner. However, due to the massive increase in demand for physical resources by datacenters cloud management suffers from inefficient resource management. To enhance efficiency by reducing resource setup time, workload prediction has become an active research area. It helps to make management decisions proactively and enables the cloud management system to better respond to spikes in the workload. This study proposes a hybrid model combining both state-of-the-art deep learning models and evolutionary algorithms for workload prediction. The proposed cluster-based differential evolution neural network model utilizes differential evolution for the optimization of feature weights of the deep neural network to predict the future workloads of a cloud datacenter. The proposed model uses a novel mutation strategy that clusters the population based on an agglomerative technique and chooses the best gene from randomly chosen clusters. 
Thus, the strategy creates a balance between the exploration and exploitation of the population and enables the model to avoid local optima and converge rapidly. The datasets used for the experiments are created from Google’s real-world traces and the Alibaba platform. The model is compared with backpropagation, Adam optimizer-based LSTM, and an evolutionary neural network-based three-mutation policy. We evaluated the performance of the proposed model in terms of root mean squared error in predicting the upcoming CPU, RAM, and BW usage. The proposed model achieved an error rate as low as 0.0002 to outperform the existing studies in the relevant literature. To further authenticate the results, we performed the statistical analysis of the obtained results in terms of R-squared, mean bias deviation, 90th percentile score, and Theil’s U statistics. The high accuracy and automaticity of the proposed model have paved the way for its application in diverse areas of cloud computing, including real-time applications.</p>\",\"PeriodicalId\":10718,\"journal\":{\"name\":\"Computing\",\"volume\":\"10 1\",\"pages\":\"\"},\"PeriodicalIF\":3.3000,\"publicationDate\":\"2024-08-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1007/s00607-024-01340-8\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, THEORY & 
METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computing","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1007/s00607-024-01340-8","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0
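The mutation strategy the abstract describes (agglomeratively cluster the population, then draw the mutation's base vector as the best member of a randomly chosen cluster) can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the fitness function, the DE parameters F and CR, the cluster count, and the centroid-linkage clustering are all assumptions, and the paper applies the scheme to deep-neural-network feature weights rather than the toy sphere function used here.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Toy fitness to minimize (stand-in for the network's prediction error)."""
    return float(np.sum(x ** 2))

def agglomerative_labels(X, k):
    """Naive agglomerative (centroid-linkage) clustering into k clusters."""
    clusters = [[i] for i in range(len(X))]
    while len(clusters) > k:
        best, best_d = None, np.inf
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                d = np.linalg.norm(X[clusters[a]].mean(0) - X[clusters[b]].mean(0))
                if d < best_d:
                    best_d, best = d, (a, b)
        a, b = best
        clusters[a] += clusters.pop(b)          # merge the closest pair
    labels = np.empty(len(X), dtype=int)
    for c, idxs in enumerate(clusters):
        labels[idxs] = c
    return labels

def clustered_de_step(pop, fitness, F=0.5, CR=0.9, k=4):
    """One DE generation where the base vector is the best member of a
    randomly chosen cluster (exploitation), while the difference vector
    still comes from random individuals (exploration)."""
    n, d = pop.shape
    labels = agglomerative_labels(pop, k)
    new_pop = pop.copy()
    for i in range(n):
        members = np.where(labels == rng.integers(k))[0]       # random cluster
        base = members[np.argmin(fitness[members])]            # its best gene
        r1, r2 = rng.choice(np.delete(np.arange(n), i), 2, replace=False)
        mutant = pop[base] + F * (pop[r1] - pop[r2])
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True                          # force one gene over
        trial = np.where(cross, mutant, pop[i])
        if sphere(trial) <= fitness[i]:                        # greedy selection
            new_pop[i] = trial
    return new_pop

pop = rng.normal(size=(20, 5))
for _ in range(50):
    fitness = np.array([sphere(x) for x in pop])
    pop = clustered_de_step(pop, fitness)
print("best fitness:", min(sphere(x) for x in pop))
```

Taking the base vector from a cluster's best member biases mutation toward promising regions, while random difference vectors preserve diversity, which is the exploration/exploitation balance the abstract claims.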
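The evaluation metrics named in the abstract (RMSE, R-squared, mean bias deviation, the 90th-percentile score, and Theil's U) have standard definitions; a minimal sketch follows, assuming Theil's U1 form and taking the 90th-percentile score to mean the 90th percentile of absolute error. The sample arrays are illustrative values, not data from the paper.

```python
import numpy as np

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def r_squared(y, yhat):
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1.0 - ss_res / ss_tot)

def mean_bias_deviation(y, yhat):
    # positive => the model over-predicts on average
    return float(np.mean(yhat - y))

def theils_u1(y, yhat):
    # Theil's U1: 0 = perfect forecast, 1 = worst case
    num = np.sqrt(np.mean((y - yhat) ** 2))
    den = np.sqrt(np.mean(y ** 2)) + np.sqrt(np.mean(yhat ** 2))
    return float(num / den)

# Illustrative values only (e.g. normalized CPU utilization)
y = np.array([0.40, 0.55, 0.62, 0.58, 0.70])      # observed
yhat = np.array([0.42, 0.53, 0.60, 0.61, 0.68])   # predicted

p90 = float(np.percentile(np.abs(y - yhat), 90))  # 90th percentile of |error|
print(rmse(y, yhat), r_squared(y, yhat),
      mean_bias_deviation(y, yhat), theils_u1(y, yhat), p90)
```

RMSE penalizes large errors quadratically, while mean bias deviation reveals systematic over- or under-prediction that RMSE hides; reporting both, plus a scale-free statistic like Theil's U, is what makes the abstract's validation more informative than RMSE alone.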
Journal description:
Computing publishes original papers, short communications, and surveys on all fields of computing. Contributions should be written in English and may be of a theoretical or applied nature; the essential criteria are computational relevance and a systematic foundation of results.