Reinforcement learning-based adaptive resource management of differentiated services in geo-distributed data centers

Xiaojie Zhou, Kun Wang, Weijia Jia, M. Guo
{"title":"基于强化学习的地理分布式数据中心差异化服务自适应资源管理","authors":"Xiaojie Zhou, Kun Wang, Weijia Jia, M. Guo","doi":"10.1109/IWQoS.2017.7969161","DOIUrl":null,"url":null,"abstract":"For better service provision and utilization of renewable energy, Internet service providers have already built their data centers in geographically distributed locations. These companies balance quality of service (QoS) revenue and power consumption by migrating virtual machines (VMs) and allocating the resource of servers adaptively. However, existing approaches model the QoS revenue by service-level agreement (SLA) violation, and ignore the network communication cost and immigration time. In this paper, we propose a reinforcement learning-based adaptive resource management algorithm, which aims to get the balance between QoS revenue and power consumption. Our algorithm does not need to assume prior distribution of resource requirements, and is robust in actual workload. It outperforms other existing approaches in three aspects: 1) The QoS revenue is directly modeled by differentiated revenue of different tasks, instead of using SLA violation. 2) For geo-distributed data centers, the time spent on VM migration and network communication cost are taken into consideration. 3) The information storage and random action selection of reinforcement learning algorithms are optimized for rapid decision making. Experiments show that our proposed algorithm is more robust than the existing algorithms. Besides, the power consumption of our algorithm is around 13.3% and 9.6% better than the existing algorithms in non-differentiated and differentiated services.","PeriodicalId":422861,"journal":{"name":"2017 IEEE/ACM 25th International Symposium on Quality of Service (IWQoS)","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-06-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"53","resultStr":"{\"title\":\"Reinforcement learning-based adaptive resource management of differentiated services in geo-distributed data centers\",\"authors\":\"Xiaojie Zhou, Kun Wang, Weijia Jia, M. Guo\",\"doi\":\"10.1109/IWQoS.2017.7969161\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"For better service provision and utilization of renewable energy, Internet service providers have already built their data centers in geographically distributed locations. These companies balance quality of service (QoS) revenue and power consumption by migrating virtual machines (VMs) and allocating the resource of servers adaptively. However, existing approaches model the QoS revenue by service-level agreement (SLA) violation, and ignore the network communication cost and immigration time. In this paper, we propose a reinforcement learning-based adaptive resource management algorithm, which aims to get the balance between QoS revenue and power consumption. Our algorithm does not need to assume prior distribution of resource requirements, and is robust in actual workload. It outperforms other existing approaches in three aspects: 1) The QoS revenue is directly modeled by differentiated revenue of different tasks, instead of using SLA violation. 2) For geo-distributed data centers, the time spent on VM migration and network communication cost are taken into consideration. 3) The information storage and random action selection of reinforcement learning algorithms are optimized for rapid decision making. Experiments show that our proposed algorithm is more robust than the existing algorithms. 
Besides, the power consumption of our algorithm is around 13.3% and 9.6% better than the existing algorithms in non-differentiated and differentiated services.\",\"PeriodicalId\":422861,\"journal\":{\"name\":\"2017 IEEE/ACM 25th International Symposium on Quality of Service (IWQoS)\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-06-14\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"53\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE/ACM 25th International Symposium on Quality of Service (IWQoS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IWQoS.2017.7969161\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE/ACM 25th International Symposium on Quality of Service (IWQoS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWQoS.2017.7969161","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 53

Abstract

For better service provision and utilization of renewable energy, Internet service providers have already built their data centers in geographically distributed locations. These companies balance quality-of-service (QoS) revenue and power consumption by migrating virtual machines (VMs) and adaptively allocating server resources. However, existing approaches model QoS revenue by service-level agreement (SLA) violation and ignore the network communication cost and VM migration time. In this paper, we propose a reinforcement learning-based adaptive resource management algorithm that aims to strike a balance between QoS revenue and power consumption. Our algorithm does not need to assume a prior distribution of resource requirements and is robust under actual workloads. It outperforms existing approaches in three aspects: 1) QoS revenue is modeled directly by the differentiated revenue of different tasks instead of by SLA violation. 2) For geo-distributed data centers, the time spent on VM migration and the network communication cost are taken into consideration. 3) The information storage and random action selection of the reinforcement learning algorithm are optimized for rapid decision making. Experiments show that the proposed algorithm is more robust than existing algorithms, and its power consumption is around 13.3% and 9.6% lower than that of existing algorithms for non-differentiated and differentiated services, respectively.
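The kind of reinforcement learning loop the abstract describes can be illustrated with a minimal tabular Q-learning sketch whose reward trades differentiated QoS revenue against power and migration costs. This is a hypothetical illustration of the general approach, not the authors' implementation: the reward weights, the epsilon-greedy selection, and the names (`reward`, `choose_action`, `q_update`) are assumptions made here for exposition.

```python
import random
from collections import defaultdict

# Illustrative sketch only: tabular Q-learning with a reward that trades off
# differentiated QoS revenue against power and VM-migration/network costs.
# State/action encodings, weights, and function names are assumptions, not
# the authors' actual design.

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1  # learning rate, discount, exploration rate

q_table = defaultdict(float)  # (state, action) -> estimated long-term value


def reward(qos_revenue, power_cost, migration_cost, w_power=1.0, w_mig=1.0):
    """Differentiated QoS revenue minus weighted power and migration costs
    (weights are placeholders)."""
    return qos_revenue - w_power * power_cost - w_mig * migration_cost


def choose_action(state, actions):
    """Epsilon-greedy selection over candidate VM placement/migration actions."""
    if random.random() < EPSILON:
        return random.choice(actions)
    return max(actions, key=lambda a: q_table[(state, a)])


def q_update(state, action, r, next_state, next_actions):
    """Standard one-step Q-learning update."""
    best_next = max(q_table[(next_state, a)] for a in next_actions)
    q_table[(state, action)] += ALPHA * (r + GAMMA * best_next - q_table[(state, action)])
```

In the paper's setting, a state would summarize server and data-center utilization, and an action would correspond to a candidate VM placement or migration, so the learned policy directly balances QoS revenue against power consumption.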