The wireless edge network is considered a promising technology for reducing latency by caching popular content at edge stations. However, optimizing caching alone is insufficient, as power allocation also strongly affects system performance under dynamic requests and time-varying wireless conditions. This interdependence motivates jointly optimizing caching and power allocation to maximize the benefit of edge networks. This paper proposes a joint caching and power allocation framework that integrates request prediction, utility-aware caching decisions, and adaptive power allocation to minimize user-perceived latency. Each edge station predicts future user requests from its recent request history, and the predicted requests serve as caching candidates. Caching decisions and transmit power allocation are then determined via alternating optimization under the system constraints. The framework consists of four functional units: user request prediction, joint caching and power allocation optimization for edge stations, latent vector update, and user power allocation. Numerical results under various network conditions and constrained storage capacity show that the proposed framework outperforms other strategies, reducing latency by up to 31.51% compared to the baseline strategy. Overall, the framework consistently reduces latency across varying network conditions, offering an effective solution for content delivery in edge networks.
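The alternating optimization mentioned above can be illustrated with a minimal sketch. The model, constants, and heuristics below (Shannon-style rates, a fixed cloud retrieval rate, greedy caching by latency saving per unit of storage, and a demand-proportional power split) are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def rate(p, gain, bandwidth=10.0, noise=1.0):
    # Shannon-style achievable rate at transmit power p (illustrative units).
    return bandwidth * np.log2(1.0 + gain * p / noise)

def total_latency(cache, power, demand, sizes, gain, cloud_rate):
    # Cached items are served over the edge wireless link; misses are
    # fetched from the cloud at a fixed (slower) rate.
    lat = np.where(cache > 0,
                   sizes / np.maximum(rate(power, gain), 1e-9),
                   sizes / cloud_rate)
    return float(np.sum(demand * lat))

def alternating_caching_power(demand, sizes, gain, capacity, power_budget,
                              cloud_rate=2.0, iters=10):
    n = len(demand)
    power = np.full(n, power_budget / n)  # start from a uniform power split
    cache = np.zeros(n)
    for _ in range(iters):
        # Caching step: with power fixed, greedily cache the items with the
        # largest demand-weighted latency saving per unit of storage.
        # Uncached items are scored with a hypothetical uniform power share.
        hyp_power = np.where(power > 0, power, power_budget / n)
        edge_lat = sizes / np.maximum(rate(hyp_power, gain), 1e-9)
        cloud_lat = sizes / cloud_rate
        saving = demand * np.maximum(cloud_lat - edge_lat, 0.0)
        cache, used = np.zeros(n), 0.0
        for i in np.argsort(-saving / sizes):
            if saving[i] > 0 and used + sizes[i] <= capacity:
                cache[i] = 1.0
                used += sizes[i]
        # Power step: with the cache fixed, split the power budget among
        # cached items in proportion to demand * size (heuristic stand-in
        # for the paper's power allocation subproblem).
        w = cache * demand * sizes
        power = (power_budget * w / w.sum() if w.sum() > 0
                 else np.full(n, power_budget / n))
    return cache, power
```

Each pass solves one subproblem with the other variable fixed, so the latency objective is non-increasing across iterations; the real framework would replace both heuristic steps with its utility-aware caching and constrained power solvers.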
