Deep Reinforcement Learning-Based Computation Offloading for Mobile Edge Computing in 6G
Haifeng Sun; Jiawei Wang; Dongping Yong; Mingwei Qin; Ning Zhang
IEEE Transactions on Consumer Electronics, vol. 70, no. 4, pp. 7482-7493
DOI: 10.1109/TCE.2024.3436824
Published: 2024-08-01
URL: https://ieeexplore.ieee.org/document/10620315/
JCR: Q1 (Engineering, Electrical & Electronic); Impact Factor: 10.9
Citations: 0
Abstract
The impending 6G network is envisioned to seamlessly interconnect a myriad of consumer electronics (CEs), facilitating a wide array of applications accessible from any location and at any time. To advance this objective, our paper proposes the integration of Mobile Edge Computing (MEC) with a multi-rotor Unmanned Aerial Vehicle (UAV), aiming to furnish computation offloading services for the CEs of Ground Devices (GDs). Additionally, charging stations (CSs) are utilized to wirelessly charge the UAVs. Our objective is to minimize the UAV's energy consumption for the entire mission by jointly optimizing resource allocation and the UAV's trajectory. This entails solving a mixed-integer nonlinear programming (MINLP) optimization problem. Initially, we decompose the UAV's trajectory into discrete offloading and charging locations, guided by a decision matrix. We then decompose the optimization problem into two sub-problems. The first determines offloading locations and resource allocation using Particle Swarm Optimization (PSO). The second optimizes the decision matrix by incorporating the PSO outputs and employing a Double Deep Q-Network (DDQN), a form of deep reinforcement learning. Simulation results demonstrate that the proposed solution significantly reduces energy consumption compared to baseline schemes.
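The two-stage decomposition in the abstract can be illustrated with a minimal sketch. The energy function below is a hypothetical quadratic surrogate, not the paper's actual UAV energy model, and the PSO hyperparameters and DDQN helper are illustrative assumptions; the sketch only shows the generic mechanics of each sub-solver: PSO searching a continuous space (sub-problem 1), and the double-Q target rule that distinguishes DDQN from plain DQN (sub-problem 2).

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Sub-problem 1: PSO over candidate offloading locations ---
# Hypothetical quadratic "energy" surrogate standing in for the paper's
# UAV mission-energy objective (minimum at pos = [3, 3]).
def energy(pos):
    return np.sum((pos - 3.0) ** 2, axis=-1)

def pso(n_particles=20, dims=2, iters=60, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-10.0, 10.0, (n_particles, dims))  # particle positions
    v = np.zeros_like(x)                               # particle velocities
    pbest, pbest_val = x.copy(), energy(x)             # personal bests
    g = pbest[np.argmin(pbest_val)]                    # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dims))
        # Standard velocity update: inertia + cognitive + social terms.
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        val = energy(x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        g = pbest[np.argmin(pbest_val)]
    return g, float(energy(g))

# --- Sub-problem 2: DDQN target computation ---
# Double Q-learning decoupling: the online net SELECTS the next action,
# the target net EVALUATES it, reducing DQN's overestimation bias.
def ddqn_target(reward, q_online_next, q_target_next, gamma=0.99, done=False):
    if done:
        return reward
    a_star = int(np.argmax(q_online_next))   # action chosen by online net
    return reward + gamma * q_target_next[a_star]  # valued by target net
```

In the paper's pipeline the PSO stage would feed its optimized locations and resource allocation into the reward signal driving the DDQN's decision-matrix updates; here the two pieces are shown independently for clarity.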
About the journal:
The main focus of the IEEE Transactions on Consumer Electronics is the engineering and research aspects of the theory, design, construction, manufacture, and end use of mass-market electronics, systems, software, and services for consumers.