Boosted regression for predicting CPU utilization in the cloud with periodicity

Khanh Nguyen Quoc, Van Tong, Cuong Dao, Tuyen Ngoc Le, Duc Tran
{"title":"用于预测云中 CPU 利用率的周期性提升回归方法","authors":"Khanh Nguyen Quoc, Van Tong, Cuong Dao, Tuyen Ngoc Le, Duc Tran","doi":"10.1007/s11227-024-06451-9","DOIUrl":null,"url":null,"abstract":"<p>Predicting CPU usage is crucial to cloud resource management. Precise CPU prediction, however, is a tough challenge due to the variable and dynamic nature of CPUs. In this paper, we introduce TrAdaBoost.WLP, a novel regression transfer boosting method that employs Long Short-Term Memory (LSTM) networks for CPU consumption prediction. Concretely, a dedicated Periodicity-aware LSTM (PA-LSTM) model is specifically developed to take into account the use of periodically repeated patterns in time series data while making predictions. To adjust for variations in CPU demands, multiple PA-LSTMs are trained and concatenated in TrAdaBoost.WLP using a boosting mechanism. TrAdaBoost.WLP and benchmarks have been thoroughly evaluated on two datasets: 160 Microsoft Azure VMs and 8 Google cluster traces. The experimental results show that TrAdaBoost.WLP can produce promising performance, improving by 32.4% and 59.3% in terms of mean squared error compared to the standard Probabilistic LSTM and ARIMA.</p>","PeriodicalId":501596,"journal":{"name":"The Journal of Supercomputing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Boosted regression for predicting CPU utilization in the cloud with periodicity\",\"authors\":\"Khanh Nguyen Quoc, Van Tong, Cuong Dao, Tuyen Ngoc Le, Duc Tran\",\"doi\":\"10.1007/s11227-024-06451-9\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Predicting CPU usage is crucial to cloud resource management. Precise CPU prediction, however, is a tough challenge due to the variable and dynamic nature of CPUs. In this paper, we introduce TrAdaBoost.WLP, a novel regression transfer boosting method that employs Long Short-Term Memory (LSTM) networks for CPU consumption prediction. Concretely, a dedicated Periodicity-aware LSTM (PA-LSTM) model is specifically developed to take into account the use of periodically repeated patterns in time series data while making predictions. To adjust for variations in CPU demands, multiple PA-LSTMs are trained and concatenated in TrAdaBoost.WLP using a boosting mechanism. TrAdaBoost.WLP and benchmarks have been thoroughly evaluated on two datasets: 160 Microsoft Azure VMs and 8 Google cluster traces. 
The experimental results show that TrAdaBoost.WLP can produce promising performance, improving by 32.4% and 59.3% in terms of mean squared error compared to the standard Probabilistic LSTM and ARIMA.</p>\",\"PeriodicalId\":501596,\"journal\":{\"name\":\"The Journal of Supercomputing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"The Journal of Supercomputing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1007/s11227-024-06451-9\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"The Journal of Supercomputing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s11227-024-06451-9","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Predicting CPU usage is crucial to cloud resource management. Precise CPU prediction, however, is a tough challenge due to the variable and dynamic nature of CPUs. In this paper, we introduce TrAdaBoost.WLP, a novel regression transfer boosting method that employs Long Short-Term Memory (LSTM) networks for CPU consumption prediction. Concretely, a dedicated Periodicity-aware LSTM (PA-LSTM) model is specifically developed to take into account the use of periodically repeated patterns in time series data while making predictions. To adjust for variations in CPU demands, multiple PA-LSTMs are trained and concatenated in TrAdaBoost.WLP using a boosting mechanism. TrAdaBoost.WLP and benchmarks have been thoroughly evaluated on two datasets: 160 Microsoft Azure VMs and 8 Google cluster traces. The experimental results show that TrAdaBoost.WLP can produce promising performance, improving by 32.4% and 59.3% in terms of mean squared error compared to the standard Probabilistic LSTM and ARIMA.
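The abstract does not spell out the PA-LSTM architecture, so the PyTorch sketch below is only a rough illustration of the underlying idea: let the network see, at every step, both the recent history and the value from exactly one period earlier, so that daily or weekly repetition in CPU traces can inform the forecast. The class name PeriodicityAwareLSTM, the window of 48 steps, the period of 288 steps (one day of 5-minute samples), and the layer sizes are illustrative assumptions, not the authors' design.

```python
# Hypothetical sketch, not the paper's PA-LSTM: an LSTM regressor whose input at each
# step pairs the current CPU reading with the reading one period earlier.
import torch
import torch.nn as nn


class PeriodicityAwareLSTM(nn.Module):  # class name assumed for illustration
    def __init__(self, hidden_size: int = 64):
        super().__init__()
        # Two features per step: current value and the value one period back.
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, window, 2) -> next-step CPU utilization, shape (batch, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])


def make_windows(series: torch.Tensor, window: int = 48, period: int = 288):
    """Pair each recent window with the same slots one period earlier; predict the next value."""
    xs, ys = [], []
    for t in range(window + period, len(series)):
        cur = series[t - window:t]                    # recent history
        lag = series[t - window - period:t - period]  # same positions, one period back
        xs.append(torch.stack([cur, lag], dim=-1))    # (window, 2)
        ys.append(series[t])
    return torch.stack(xs), torch.stack(ys).unsqueeze(-1)


if __name__ == "__main__":
    cpu = torch.rand(1200)  # placeholder trace; a real run would use the Azure/Google traces
    X, y = make_windows(cpu)
    model = PeriodicityAwareLSTM()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(5):      # tiny full-batch training loop, just to show the shapes fit
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()
```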

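The abstract likewise leaves the boosting mechanism of TrAdaBoost.WLP unspecified beyond training several PA-LSTMs and combining them. For intuition only, the sketch below shows a generic AdaBoost.R2-style regression boosting loop (reweight hard samples, combine learners by weighted median), with a plain weighted ridge regression standing in for the PA-LSTM base learner; nothing here should be read as the paper's TrAdaBoost.WLP.

```python
# Generic AdaBoost.R2-style regression boosting, shown only for intuition about the
# "train several learners and combine them by boosting" step; this is NOT TrAdaBoost.WLP.
import numpy as np


def fit_ridge(X, y, w, lam=1e-2):
    """Weighted ridge regression base learner; returns a predict(Xq) closure."""
    Xb = np.hstack([X, np.ones((len(X), 1))])          # add bias column
    A = Xb.T @ (w[:, None] * Xb) + lam * np.eye(Xb.shape[1])
    theta = np.linalg.solve(A, Xb.T @ (w * y))
    return lambda Xq: np.hstack([Xq, np.ones((len(Xq), 1))]) @ theta


def boosted_regressor(X, y, rounds=5):
    n = len(X)
    w = np.full(n, 1.0 / n)
    models, betas = [], []
    for _ in range(rounds):
        model = fit_ridge(X, y, w)
        err = np.abs(model(X) - y)
        err = err / max(err.max(), 1e-12)              # normalized loss in [0, 1]
        eps = float(np.sum(w * err))
        if eps >= 0.5:                                 # base learner too weak to help
            if not models:                             # keep at least one model
                models.append(model)
                betas.append(0.5)
            break
        beta = max(eps, 1e-6) / (1.0 - eps)            # small error -> small beta -> big vote
        models.append(model)
        betas.append(beta)
        w = w * beta ** (1.0 - err)                    # shrink weight of well-fit samples
        w = w / w.sum()

    def predict(Xq):
        preds = np.array([m(Xq) for m in models])      # (n_models, n_query)
        coef = np.log(1.0 / np.array(betas))
        order = np.argsort(preds, axis=0)              # weighted median across learners
        csum = np.cumsum(coef[order], axis=0)
        idx = np.argmax(csum >= 0.5 * coef.sum(), axis=0)
        return np.take_along_axis(preds, order, axis=0)[idx, np.arange(preds.shape[1])]

    return predict


# Toy usage on synthetic features (stand-ins for flattened prediction windows).
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))
y = X @ rng.standard_normal(10) + 0.1 * rng.standard_normal(500)
predict = boosted_regressor(X, y)
print(float(np.mean((predict(X) - y) ** 2)))           # in-sample MSE of the ensemble
```

In the paper, a transfer step (reweighting source-domain versus target-domain samples, as in TrAdaBoost) would presumably replace the plain sample reweighting above; the abstract does not give those details, so they are omitted here.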
