{"title":"Cost-Effective Power Delivery via Deep Reinforcement Learning-Based Dynamic Electric Vehicle Transportation","authors":"Zheng Bao;Changbing Tang;Xinghuo Yu;Feilong Lin;Guanghui Wen;Zhonglong Zheng","doi":"10.1109/JIOT.2025.3552823","DOIUrl":null,"url":null,"abstract":"Power delivery issues are increasingly evident in cyber-physical smart grid systems as energy transactions frequently overlook the physical constraints of distribution, leading to transmission congestion and compromising network security and reliability. This article presents a novel and cost-effective solution to power delivery challenges by utilizing electric vehicles (EVs) with dynamic transportation capabilities as free carriers. Unlike traditional approaches, a deep reinforcement learning (DRL)-based optimization framework is designed to effectively manage incomplete information in real-time. Our method first introduces an investment-free model that leverages existing EV routes to transport energy during congestion, operating in a “free-riding” transmission mode. This not only enhances network reliability but also curtails costs. Then, we develop a Markov decision process (MDP) for sequential decision-making of 24-h optimal control, aimed at minimizing operational losses including load shedding and battery degradation. To deal with the stochastic nature of energy requests and EV routes in the control problem, we employ a model-free DRL algorithm to tackle the challenge of incomplete information. An Actor-Critic network, combining value-based and policy-based approaches, helps discover approximately optimal strategies in a continuous action space. Finally, the simulation results numerically demonstrate the performance of the proposed method.","PeriodicalId":54347,"journal":{"name":"IEEE Internet of Things Journal","volume":"12 13","pages":"23245-23256"},"PeriodicalIF":8.9000,"publicationDate":"2025-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Internet of Things Journal","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10934055/","RegionNum":1,"RegionCategory":"Computer Science","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0
Abstract
Power delivery issues are increasingly evident in cyber-physical smart grid systems, as energy transactions frequently overlook the physical constraints of distribution, leading to transmission congestion and compromising network security and reliability. This article presents a novel, cost-effective solution to power delivery challenges that uses electric vehicles (EVs) with dynamic transportation capabilities as free energy carriers. Unlike traditional approaches, a deep reinforcement learning (DRL)-based optimization framework is designed to manage incomplete information effectively in real time. Our method first introduces an investment-free model that leverages existing EV routes to transport energy during congestion, operating in a “free-riding” transmission mode; this enhances network reliability while curtailing costs. We then formulate a Markov decision process (MDP) for sequential decision making over a 24-h optimal control horizon, aimed at minimizing operational losses, including load shedding and battery degradation. To handle the stochastic nature of energy requests and EV routes in the control problem, we employ a model-free DRL algorithm that copes with incomplete information. An Actor-Critic network, combining value-based and policy-based approaches, discovers approximately optimal strategies in a continuous action space. Finally, simulation results demonstrate the performance of the proposed method.
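As a hedged illustration of the components the abstract describes, the sketch below pairs a hypothetical per-step operational-loss function (load shedding plus battery degradation) with a minimal Actor-Critic update for a one-dimensional continuous action such as EV dispatch power. The cost coefficients, the linear function approximators, and the `step_cost`/`ActorCritic` names are all assumptions made for illustration, not details taken from the paper.

```python
import numpy as np

# Hypothetical per-step operational loss for a 24-h MDP of the kind the
# abstract describes: unserved load (load shedding) is penalized, as is
# battery wear proportional to charge/discharge throughput. All prices
# are illustrative placeholders, not values from the paper.
def step_cost(demand_kw, delivered_kw, throughput_kwh,
              shed_price=10.0, degradation_price=0.05):
    shed_kw = max(demand_kw - delivered_kw, 0.0)   # load that went unserved
    return shed_price * shed_kw + degradation_price * throughput_kwh


class ActorCritic:
    """Minimal one-step Actor-Critic with a Gaussian policy over a 1-D
    continuous action, using linear function approximation -- a sketch of
    the value-based + policy-based combination the abstract mentions."""

    def __init__(self, lr_actor=1e-3, lr_critic=1e-2, sigma=0.5, gamma=0.95):
        self.w_mu = 0.0        # actor weight: mean action = w_mu * state
        self.w_v = 0.0         # critic weight: value V(s) = w_v * state
        self.lr_actor = lr_actor
        self.lr_critic = lr_critic
        self.sigma = sigma     # fixed exploration noise
        self.gamma = gamma

    def act(self, s, rng):
        # Sample a continuous action from N(w_mu * s, sigma^2).
        return rng.normal(self.w_mu * s, self.sigma)

    def update(self, s, a, reward, s_next, done):
        # TD target and TD error; here reward would be -step_cost(...).
        target = reward + (0.0 if done else self.gamma * self.w_v * s_next)
        td_error = target - self.w_v * s
        # Critic: semi-gradient TD(0) step on the value weight.
        self.w_v += self.lr_critic * td_error * s
        # Actor: policy-gradient step using grad log N(a; w_mu*s, sigma^2).
        grad_logp = (a - self.w_mu * s) / self.sigma**2 * s
        self.w_mu += self.lr_actor * td_error * grad_logp
        return td_error
```

In a full pipeline, one episode would span 24 hourly steps, with the state encoding quantities such as congestion level and EV availability, and the per-step reward set to the negative operational loss.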
About the journal:
The IEEE Internet of Things (IoT) Journal publishes research and review articles covering all aspects of IoT, including IoT system architecture, enabling technologies, communication and networking protocols (such as network coding), and IoT services and applications. Topics encompass IoT's impact on sensor technologies, big data management, and future Internet design for applications such as smart cities and smart homes. Fields of interest include: IoT architecture, such as things-centric, data-centric, and service-oriented designs; IoT enabling technologies and systematic integration, such as sensor technologies, big sensor data management, and future Internet design for IoT; IoT services, applications, and test-beds, such as IoT service middleware, IoT application programming interfaces (APIs), IoT application design, and IoT trials/experiments; and IoT standardization activities and technology development in standards development organizations (SDOs) such as IEEE, IETF, ITU, 3GPP, and ETSI.