The rapid expansion of 5G and the upcoming arrival of 6G have significantly increased the demand for cloud computing resources, especially in edge cloud servers, to meet stringent connectivity and latency requirements. This surge has raised serious energy concerns, as data centers now account for roughly 1–1.5% of global energy consumption and contribute about 1% of global CO₂ emissions. In response, this study proposes a novel energy-aware machine learning model that uses power sensor data from physical machines (PMs) in data centers to optimize energy consumption, with container placement managed as a use case.
We conducted experiments in a testbed using realistic 5G traffic scenarios, deliberately avoiding artificial stressors such as stress-ng, which create synthetic loads that do not accurately reflect real-world resource utilization. Our machine learning model, particularly the XGBoost implementation, proved highly effective, achieving a score of 91.2%. The model reduced energy consumption by 3% and improved task completion times, all without explicit consolidation strategies or cluster reconfiguration.
This approach highlights the power of machine learning in optimizing energy efficiency in dynamic and resource-intensive environments such as edge cloud servers, providing a scalable solution for data centers facing increasing energy demands.
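The abstract describes predicting PM energy draw from sensor telemetry with a gradient-boosted model. The sketch below illustrates that kind of regressor on synthetic data; the feature names, data, and coefficients are invented stand-ins, and scikit-learn's `GradientBoostingRegressor` substitutes for the paper's XGBoost implementation to keep the example dependency-light.

```python
# Hedged sketch: gradient-boosted regression of PM power from telemetry.
# All data and feature choices here are synthetic assumptions, not the
# paper's dataset; GradientBoostingRegressor stands in for XGBoost.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Hypothetical per-PM features: CPU util, memory util, container count (scaled).
X = rng.uniform(0.0, 1.0, size=(n, 3))
# Synthetic power reading (watts): idle draw + load-dependent terms + sensor noise.
y = 120 + 180 * X[:, 0] + 40 * X[:, 1] + 5 * X[:, 2] + rng.normal(0, 5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
score = r2_score(y_te, model.predict(X_te))
print(f"held-out R^2: {score:.3f}")
```

A model of this shape can then score candidate container placements by their predicted power cost, which is the role the abstract assigns to the learned model.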
