Deep Reinforcement Learning-based Building Energy Management using Electric Vehicles for Demand Response

Daeyoung Kang, Seunghyun Yoon, Hyuk-Soon Lim
{"title":"基于深度强化学习的电动汽车需求响应建筑能源管理","authors":"Daeyoung Kang, Seunghyun Yoon, Hyuk-Soon Lim","doi":"10.1109/ICAIIC57133.2023.10066975","DOIUrl":null,"url":null,"abstract":"In recent years, stability issues of power grids have become critical with the rapid increase in power consumption. Demand response (DR) is a policy that incentivizes consumers to reduce their power usage so that electricity demand does not exceed the supply of a power grid to prevent the power grid's instability. We propose a Deep Q-Network (DQN)-based building energy management system that reduces the amount of electricity supplied by electric power companies by utilizing the surplus power of electric vehicles (EVs) upon DR requests. The proposed scheme considers the DR incentives and penalties as well as the cost of buying energy from EVs. In addition, the amount of time used for discharging EVs is also taken into consideration in DQN's reward function. We perform the simulations to compare the proposed scheme with a random selection scheme and a greedy scheme to recruit the nearest EVs until the DR request is fulfilled. The simulation result indicates that the proposed scheme succeeds to balance the building cost and the EV waiting time performance at the EV stations.","PeriodicalId":105769,"journal":{"name":"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2023-02-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep Reinforcement Learning-based Building Energy Management using Electric Vehicles for Demand Response\",\"authors\":\"Daeyoung Kang, Seunghyun Yoon, Hyuk-Soon Lim\",\"doi\":\"10.1109/ICAIIC57133.2023.10066975\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In recent years, stability issues of power grids have become critical with the rapid increase in power consumption. Demand response (DR) is a policy that incentivizes consumers to reduce their power usage so that electricity demand does not exceed the supply of a power grid to prevent the power grid's instability. We propose a Deep Q-Network (DQN)-based building energy management system that reduces the amount of electricity supplied by electric power companies by utilizing the surplus power of electric vehicles (EVs) upon DR requests. The proposed scheme considers the DR incentives and penalties as well as the cost of buying energy from EVs. In addition, the amount of time used for discharging EVs is also taken into consideration in DQN's reward function. We perform the simulations to compare the proposed scheme with a random selection scheme and a greedy scheme to recruit the nearest EVs until the DR request is fulfilled. 
The simulation result indicates that the proposed scheme succeeds to balance the building cost and the EV waiting time performance at the EV stations.\",\"PeriodicalId\":105769,\"journal\":{\"name\":\"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-02-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICAIIC57133.2023.10066975\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIIC57133.2023.10066975","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

In recent years, stability issues of power grids have become critical with the rapid increase in power consumption. Demand response (DR) is a policy that incentivizes consumers to reduce their power usage so that electricity demand does not exceed the supply of a power grid, preventing grid instability. We propose a Deep Q-Network (DQN)-based building energy management system that reduces the amount of electricity supplied by electric power companies by utilizing the surplus power of electric vehicles (EVs) upon DR requests. The proposed scheme considers the DR incentives and penalties as well as the cost of buying energy from EVs. In addition, the amount of time used for discharging EVs is taken into account in the DQN's reward function. We perform simulations to compare the proposed scheme with a random selection scheme and a greedy scheme that recruits the nearest EVs until the DR request is fulfilled. The simulation results indicate that the proposed scheme succeeds in balancing the building cost and the EV waiting time at the EV stations.
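The abstract describes the DQN reward as combining DR incentives and penalties, the cost of buying energy from EVs, and the EV discharging time, and compares the learned policy against a greedy baseline that recruits the nearest EVs until the DR request is met. Below is a minimal Python sketch of both ideas; the weighting factors, function names, and EV record schema are illustrative assumptions, not the paper's actual formulation.

```python
# Illustrative sketch only: the paper does not give its exact reward formula or
# baseline pseudocode, so every weight, name, and data field here is an assumption.

def dqn_reward(dr_incentive, dr_penalty, ev_energy_cost, discharge_time,
               w_cost=1.0, w_time=0.1):
    """Scalar reward combining the four factors listed in the abstract:
    DR incentive earned, DR penalty incurred, cost of energy bought from EVs,
    and the total time EVs spend discharging (assumed weights w_cost, w_time)."""
    return dr_incentive - dr_penalty - w_cost * ev_energy_cost - w_time * discharge_time


def greedy_recruit(evs, dr_request_kwh):
    """Greedy baseline from the abstract: recruit the nearest EVs until their
    combined surplus energy covers the DR request.

    evs: list of dicts with 'distance_km' and 'surplus_kwh' keys (hypothetical schema).
    Returns the recruited EVs and the total surplus energy they provide.
    """
    recruited, covered = [], 0.0
    for ev in sorted(evs, key=lambda e: e["distance_km"]):
        if covered >= dr_request_kwh:
            break
        recruited.append(ev)
        covered += ev["surplus_kwh"]
    return recruited, covered


if __name__ == "__main__":
    fleet = [
        {"distance_km": 0.2, "surplus_kwh": 5.0},
        {"distance_km": 0.5, "surplus_kwh": 8.0},
        {"distance_km": 1.1, "surplus_kwh": 12.0},
    ]
    chosen, energy = greedy_recruit(fleet, dr_request_kwh=10.0)
    print(len(chosen), energy)  # recruits the two nearest EVs (13.0 kWh total)
```

In the proposed scheme, the DQN would learn which EVs to recruit by maximizing a reward of this general shape, whereas the greedy baseline ignores cost and waiting-time trade-offs and simply fills the DR request by distance.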