{"title":"基于深度强化学习的电动汽车智能电网集成","authors":"Farkhondeh Kiaee","doi":"10.1109/IKT51791.2020.9345625","DOIUrl":null,"url":null,"abstract":"The vehicle-to-grid (V2G) technology provides an opportunity to generate revenue by selling electricity back to the grid at peak times when electricity is more expensive. Instead of sharing a contaminated pump handle at a gas station during the current covid-19 pandemic, plugging in the electric vehicle (EV) at home makes feel much safer. A V2G control algorithm is necessary to decide whether the electric vehicle (EV) should be charged or discharged in each hour. In this paper, we study the real-time V2G control problem under price uncertainty where the electricity price is determined dynamically every hour. Our model is inspired by the Deep Q-learning (DQN) algorithm which combines popular Q-learning with a deep neural network. The proposed Double-DQN model is an update of the DQN which maintains two distinct networks to select or evaluate an action. The Double-DQN algorithm is used to control charge/discharge operation in the hourly available electricity price in order to maximize the profit for the EV owner during the whole parking time. Experiment results show that our proposed method can work effectively in the real electricity market and it is able to increase the profit significantly compared with the other state-of-the-art EV charging schemes.","PeriodicalId":382725,"journal":{"name":"2020 11th International Conference on Information and Knowledge Technology (IKT)","volume":"117 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Integration of Electric Vehicles in Smart Grid using Deep Reinforcement Learning\",\"authors\":\"Farkhondeh Kiaee\",\"doi\":\"10.1109/IKT51791.2020.9345625\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The vehicle-to-grid (V2G) technology provides an opportunity to generate revenue by selling electricity back to the grid at peak times when electricity is more expensive. Instead of sharing a contaminated pump handle at a gas station during the current covid-19 pandemic, plugging in the electric vehicle (EV) at home makes feel much safer. A V2G control algorithm is necessary to decide whether the electric vehicle (EV) should be charged or discharged in each hour. In this paper, we study the real-time V2G control problem under price uncertainty where the electricity price is determined dynamically every hour. Our model is inspired by the Deep Q-learning (DQN) algorithm which combines popular Q-learning with a deep neural network. The proposed Double-DQN model is an update of the DQN which maintains two distinct networks to select or evaluate an action. The Double-DQN algorithm is used to control charge/discharge operation in the hourly available electricity price in order to maximize the profit for the EV owner during the whole parking time. 
Experiment results show that our proposed method can work effectively in the real electricity market and it is able to increase the profit significantly compared with the other state-of-the-art EV charging schemes.\",\"PeriodicalId\":382725,\"journal\":{\"name\":\"2020 11th International Conference on Information and Knowledge Technology (IKT)\",\"volume\":\"117 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 11th International Conference on Information and Knowledge Technology (IKT)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IKT51791.2020.9345625\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 11th International Conference on Information and Knowledge Technology (IKT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IKT51791.2020.9345625","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Integration of Electric Vehicles in Smart Grid using Deep Reinforcement Learning
Vehicle-to-grid (V2G) technology provides an opportunity to generate revenue by selling electricity back to the grid at peak times, when electricity is more expensive. During the current COVID-19 pandemic, plugging in the electric vehicle (EV) at home also feels much safer than sharing a contaminated pump handle at a gas station. A V2G control algorithm is needed to decide, for each hour, whether the EV should be charged or discharged. In this paper, we study the real-time V2G control problem under price uncertainty, where the electricity price is determined dynamically every hour. Our model is inspired by the Deep Q-learning (DQN) algorithm, which combines popular Q-learning with a deep neural network. The proposed Double-DQN model is an extension of DQN that maintains two distinct networks, one to select an action and one to evaluate it. The Double-DQN algorithm is used to control the charge/discharge operation under the hourly electricity price so as to maximize the profit for the EV owner over the whole parking time. Experimental results show that the proposed method works effectively in a real electricity market and increases profit significantly compared with other state-of-the-art EV charging schemes.
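To make the control scheme described in the abstract concrete, the sketch below shows how a Double-DQN target can be computed for an hourly charge/discharge decision. This is a minimal illustration under stated assumptions, not the paper's reported implementation: the network sizes, the three-action space (discharge/idle/charge), and the state features (hourly price, battery state of charge, remaining parking hours) are all placeholders chosen for the example.

```python
# Minimal Double-DQN target sketch for hourly EV charge/discharge control.
# All names, state features, and network sizes are illustrative assumptions;
# the paper's actual architecture and state definition may differ.
import torch
import torch.nn as nn

ACTIONS = 3  # assumed action space: 0 = discharge (sell), 1 = idle, 2 = charge (buy)

def make_q_net(state_dim: int = 3, hidden: int = 64) -> nn.Module:
    """Small MLP mapping a state (price, SOC, hours remaining) to Q-values."""
    return nn.Sequential(
        nn.Linear(state_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, ACTIONS),
    )

online_net = make_q_net()   # selects the greedy next action
target_net = make_q_net()   # evaluates the selected action
target_net.load_state_dict(online_net.state_dict())  # periodically re-synced in training

def double_dqn_target(reward, next_state, done, gamma=0.99):
    """Double-DQN target: argmax taken from the online net, value from the target net."""
    with torch.no_grad():
        best_action = online_net(next_state).argmax(dim=1, keepdim=True)
        next_q = target_net(next_state).gather(1, best_action).squeeze(1)
        return reward + gamma * next_q * (1.0 - done)

def train_step(optimizer, batch, gamma=0.99):
    """One gradient step on a (state, action, reward, next_state, done) batch."""
    state, action, reward, next_state, done = batch
    q = online_net(state).gather(1, action.unsqueeze(1)).squeeze(1)
    target = double_dqn_target(reward, next_state, done, gamma)
    loss = nn.functional.smooth_l1_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Decoupling action selection (online network) from action evaluation (target network) is what distinguishes Double DQN from plain DQN and mitigates Q-value overestimation. In this setting the hourly reward would plausibly be the revenue from energy sold minus the cost of energy bought at that hour's price, but the exact reward definition is taken from the paper, not from this sketch.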