Xu Zhou, Jing Yang, Yijun Li, Shaobo Li, Zhidong Su
{"title":"基于深度强化学习的资源调度,在 SDN 驱动的边缘计算中实现能源优化和负载平衡","authors":"Xu Zhou , Jing Yang , Yijun Li , Shaobo Li , Zhidong Su","doi":"10.1016/j.comcom.2024.107925","DOIUrl":null,"url":null,"abstract":"<div><p>Traditional techniques for edge computing resource scheduling may result in large amounts of wasted server resources and energy consumption; thus, exploring new approaches to achieve higher resource and energy efficiency is a new challenge. Deep reinforcement learning (DRL) offers a promising solution by balancing resource utilization, latency, and energy optimization. However, current methods often focus solely on energy optimization for offloading and computing tasks, neglecting the impact of server numbers and resource operation status on energy efficiency and load balancing. On the other hand, prioritizing latency optimization may result in resource imbalance and increased energy waste. To address these challenges, we propose a novel energy optimization method coupled with a load balancing strategy. Our approach aims to minimize overall energy consumption and achieve server load balancing under latency constraints. This is achieved by controlling the number of active servers and individual server load states through a two stage DRL-based energy and resource optimization algorithm. Experimental results demonstrate that our scheme can save an average of 19.84% energy compared to mainstream reinforcement learning methods and 49.60% and 45.33% compared to Round Robin (RR) and random scheduling, respectively. Additionally, our method is optimized for reward value, load balancing, runtime, and anti-interference capability.</p></div>","PeriodicalId":55224,"journal":{"name":"Computer Communications","volume":"226 ","pages":"Article 107925"},"PeriodicalIF":4.5000,"publicationDate":"2024-08-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep reinforcement learning-based resource scheduling for energy optimization and load balancing in SDN-driven edge computing\",\"authors\":\"Xu Zhou , Jing Yang , Yijun Li , Shaobo Li , Zhidong Su\",\"doi\":\"10.1016/j.comcom.2024.107925\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Traditional techniques for edge computing resource scheduling may result in large amounts of wasted server resources and energy consumption; thus, exploring new approaches to achieve higher resource and energy efficiency is a new challenge. Deep reinforcement learning (DRL) offers a promising solution by balancing resource utilization, latency, and energy optimization. However, current methods often focus solely on energy optimization for offloading and computing tasks, neglecting the impact of server numbers and resource operation status on energy efficiency and load balancing. On the other hand, prioritizing latency optimization may result in resource imbalance and increased energy waste. To address these challenges, we propose a novel energy optimization method coupled with a load balancing strategy. Our approach aims to minimize overall energy consumption and achieve server load balancing under latency constraints. This is achieved by controlling the number of active servers and individual server load states through a two stage DRL-based energy and resource optimization algorithm. 
Experimental results demonstrate that our scheme can save an average of 19.84% energy compared to mainstream reinforcement learning methods and 49.60% and 45.33% compared to Round Robin (RR) and random scheduling, respectively. Additionally, our method is optimized for reward value, load balancing, runtime, and anti-interference capability.</p></div>\",\"PeriodicalId\":55224,\"journal\":{\"name\":\"Computer Communications\",\"volume\":\"226 \",\"pages\":\"Article 107925\"},\"PeriodicalIF\":4.5000,\"publicationDate\":\"2024-08-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computer Communications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0140366424002640\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computer Communications","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0140366424002640","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Deep reinforcement learning-based resource scheduling for energy optimization and load balancing in SDN-driven edge computing
Traditional techniques for edge computing resource scheduling can waste substantial server resources and energy; exploring new approaches that achieve higher resource and energy efficiency is therefore an open challenge. Deep reinforcement learning (DRL) offers a promising solution by balancing resource utilization, latency, and energy optimization. However, current methods often focus solely on energy optimization for offloading and computing tasks, neglecting the impact of the number of servers and their resource operation status on energy efficiency and load balancing. Conversely, prioritizing latency optimization may cause resource imbalance and increased energy waste. To address these challenges, we propose a novel energy optimization method coupled with a load balancing strategy. Our approach minimizes overall energy consumption and achieves server load balancing under latency constraints by controlling the number of active servers and individual server load states through a two-stage DRL-based energy and resource optimization algorithm. Experimental results demonstrate that our scheme saves an average of 19.84% energy compared to mainstream reinforcement learning methods, and 49.60% and 45.33% compared to Round Robin (RR) and random scheduling, respectively. Our method also performs well in terms of reward value, load balancing, runtime, and anti-interference capability.
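The abstract does not include code, but the two-stage control idea it describes is easy to illustrate. Below is a minimal Python sketch, not the authors' implementation: stage 1 chooses how many servers to keep active, and stage 2 assigns tasks to the least-loaded active server, with a shaped reward combining energy cost, load imbalance, and a latency penalty. All constants, function names, and the stand-in heuristics are illustrative assumptions; in the paper, both stage decisions would come from trained DRL agents rather than the fixed rules used here.

```python
# Illustrative sketch of a two-stage energy/load-balancing scheduler.
# Stage 1: decide the number of active servers (energy lever).
# Stage 2: place tasks across active servers (load-balancing lever).
# The reward shaping and all constants are assumptions for illustration.
import random
from statistics import pvariance

NUM_SERVERS = 8        # assumed cluster size
IDLE_POWER = 50.0      # assumed idle power per active server (W)
DYNAMIC_POWER = 100.0  # assumed extra power at full utilization (W)
LATENCY_LIMIT = 1.0    # assumed per-server load level that breaches latency

def stage1_active_servers(total_demand: float) -> int:
    """Stage 1: pick how many servers to activate.
    A trained DRL agent would output this action; a demand-proportional
    heuristic targeting ~80% utilization stands in for it here."""
    needed = max(1, round(total_demand / 0.8))
    return min(NUM_SERVERS, needed)

def stage2_assign(tasks, n_active):
    """Stage 2: greedily place each task on the least-loaded active
    server, mimicking the load-balancing objective of the second stage."""
    loads = [0.0] * n_active
    for demand in tasks:
        i = loads.index(min(loads))  # least-loaded server
        loads[i] += demand
    return loads

def reward(loads):
    """Illustrative reward: negative energy minus load imbalance, with a
    large penalty whenever a server's load implies a latency breach."""
    energy = sum(IDLE_POWER + DYNAMIC_POWER * min(l, 1.0) for l in loads)
    imbalance = pvariance(loads) if len(loads) > 1 else 0.0
    latency_penalty = sum(100.0 for l in loads if l > LATENCY_LIMIT)
    return -(energy + 10.0 * imbalance + latency_penalty)

if __name__ == "__main__":
    tasks = [random.uniform(0.1, 0.5) for _ in range(12)]  # task CPU demands
    n_active = stage1_active_servers(sum(tasks))
    loads = stage2_assign(tasks, n_active)
    print(f"active servers: {n_active}")
    print(f"server loads:   {[round(l, 2) for l in loads]}")
    print(f"reward:         {reward(loads):.2f}")
```

The design point the sketch tries to capture is the coupling the abstract emphasizes: activating fewer servers lowers the idle-power term but raises per-server load (and the latency penalty), so the two stages must be optimized jointly rather than tuning energy or latency alone.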
Journal introduction:
Computer and communications networks are key infrastructures of the information society, with high socio-economic value, as they support the correct operation of many critical services (from healthcare to finance and transportation). The Internet is the core of today's computer-communication infrastructures. It has evolved from a robust network for data transfer between computers into a global, content-rich communication and information system in which content is increasingly generated by users and distributed according to human social relations. Next-generation network technologies, architectures, and protocols are therefore required to overcome the limitations of the legacy Internet and add new capabilities and services. The future Internet should be ubiquitous, secure, resilient, and closer to human communication paradigms.
Computer Communications is a peer-reviewed international journal that publishes high-quality scientific articles (both theory and practice) and survey papers covering all aspects of future computer communication networks (on all layers, except the physical layer), with special attention to the evolution of the Internet architecture, protocols, services, and applications.