{"title":"Adaptive Prioritization and Task Offloading in Vehicular Edge Computing Through Deep Reinforcement Learning","authors":"Ashab Uddin;Ahmed Hamdi Sakr;Ning Zhang","doi":"10.1109/TVT.2024.3499962","DOIUrl":null,"url":null,"abstract":"Vehicular edge computing enables real-time decision-making by offloading vehicular computation tasks to edge servers along roadways. This paper focuses on optimizing offloading and scheduling these tasks, with an emphasis on task prioritization to maximize task completion within deadlines while minimizing latency and energy consumption across all priority levels. We propose a prioritized Deep Q-Network (DQNP) that optimizes long-term rewards through a priority-scaled reward system for each priority level, guiding the deep reinforcement learning (DRL) agent to select optimal actions. The model dynamically adjusts task selection based on environmental conditions, such as prioritizing tasks with higher deadlines in poor channel states, ensuring balanced and efficient offloading across all priority levels. Simulation results demonstrate that DQNP outperforms existing baseline algorithms, increasing task completion by 14%, particularly for high-priority tasks, while reducing energy consumption by 8% and maintaining similar latency. Additionally, the model mitigates resource starvation for lower-priority tasks, achieving task selection rates of 27%, 32%, and 42% for low-, medium-, and high-priority tasks, with completion ratios of 88%, 87%, and 86%, respectively, reflecting balanced resource allocation across priority classes.","PeriodicalId":13421,"journal":{"name":"IEEE Transactions on Vehicular Technology","volume":"74 3","pages":"5038-5052"},"PeriodicalIF":6.1000,"publicationDate":"2024-11-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Vehicular Technology","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10755183/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
引用次数: 0
Abstract
Vehicular edge computing enables real-time decision-making by offloading vehicular computation tasks to edge servers along roadways. This paper focuses on optimizing the offloading and scheduling of these tasks, with an emphasis on task prioritization to maximize task completion within deadlines while minimizing latency and energy consumption across all priority levels. We propose a prioritized Deep Q-Network (DQNP) that optimizes long-term rewards through a priority-scaled reward system for each priority level, guiding the deep reinforcement learning (DRL) agent to select optimal actions. The model dynamically adjusts task selection based on environmental conditions, such as prioritizing tasks with longer deadlines under poor channel conditions, ensuring balanced and efficient offloading across all priority levels. Simulation results demonstrate that DQNP outperforms existing baseline algorithms, increasing task completion by 14%, particularly for high-priority tasks, while reducing energy consumption by 8% and maintaining similar latency. Additionally, the model mitigates resource starvation for lower-priority tasks, achieving task selection rates of 27%, 32%, and 42% for low-, medium-, and high-priority tasks, with completion ratios of 88%, 87%, and 86%, respectively, reflecting balanced resource allocation across priority classes.
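
To make the abstract's notion of a priority-scaled reward more concrete, the following is a minimal, illustrative Python sketch of how such a reward could be computed for a single offloaded task. It is not the paper's implementation: the names (Task, priority_scaled_reward, PRIORITY_WEIGHTS) and the specific weights and penalty terms are assumptions chosen only to show the general structure of scaling a latency/energy-aware reward by task priority.

# Illustrative sketch only, under the assumptions stated above; not the DQNP reward from the paper.
from dataclasses import dataclass

# Hypothetical priority weights: higher-priority tasks scale the reward more strongly.
PRIORITY_WEIGHTS = {"low": 1.0, "medium": 2.0, "high": 3.0}

@dataclass
class Task:
    priority: str      # "low", "medium", or "high"
    deadline_s: float  # time remaining before the task result is useless
    completed: bool    # whether the offloaded task finished successfully

def priority_scaled_reward(task: Task, latency_s: float, energy_j: float,
                           w_latency: float = 1.0, w_energy: float = 0.5) -> float:
    """Reward an on-time completion, penalize latency and energy use,
    then scale by the task's priority weight (assumed structure)."""
    weight = PRIORITY_WEIGHTS[task.priority]
    if task.completed and latency_s <= task.deadline_s:
        base = 1.0 - w_latency * latency_s - w_energy * energy_j
    else:
        base = -1.0  # missed deadline or failed offload
    return weight * base

if __name__ == "__main__":
    t = Task(priority="high", deadline_s=0.5, completed=True)
    print(priority_scaled_reward(t, latency_s=0.2, energy_j=0.3))

In a setup like this, a DRL agent that maximizes the cumulative reward is pushed toward completing high-priority tasks on time, while the nonzero weights on low- and medium-priority tasks discourage starving them entirely.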
Journal Introduction
The scope of the Transactions is threefold (as approved by the IEEE Periodicals Committee in 1967) and is published on the journal website as follows:

Communications: The use of mobile radio on land, sea, and air, including cellular radio, two-way radio, and one-way radio, with applications to dispatch and control vehicles, mobile radiotelephone, radio paging, and status monitoring and reporting. Related areas include spectrum usage, component radio equipment such as cavities and antennas, computer control for radio systems, digital modulation and transmission techniques, mobile radio circuit design, radio propagation for vehicular communications, effects of ignition noise and radio frequency interference, and consideration of the vehicle as part of the radio operating environment.

Transportation Systems: The use of electronic technology for the control of ground transportation systems including, but not limited to, traffic aid systems; traffic control systems; automatic vehicle identification, location, and monitoring systems; automated transport systems, with single and multiple vehicle control; and moving walkways or people-movers.

Vehicular Electronics: The use of electronic or electrical components and systems for control, propulsion, or auxiliary functions, including, but not limited to, electronic controls for engine, drive train, convenience, safety, and other vehicle systems; sensors, actuators, and microprocessors for onboard use; electronic fuel control systems; vehicle electrical components and systems; collision avoidance systems; electromagnetic compatibility in the vehicle environment; and electric vehicles and controls.