Optimization of a photovoltaic-battery system using deep reinforcement learning and load forecasting

Energy and AI · IF 9.6 · Q1 (Computer Science, Artificial Intelligence) · Publication date: 2024-02-02 · DOI: 10.1016/j.egyai.2024.100347
António Corte Real, G. Pontes Luz, J.M.C. Sousa, M.C. Brito, S.M. Vieira
{"title":"利用深度强化学习和负荷预测优化光伏电池系统","authors":"António Corte Real ,&nbsp;G. Pontes Luz ,&nbsp;J.M.C. Sousa ,&nbsp;M.C. Brito ,&nbsp;S.M. Vieira","doi":"10.1016/j.egyai.2024.100347","DOIUrl":null,"url":null,"abstract":"<div><p>Home Energy Management Systems (HEMS) are increasingly relevant for demand-side management at the residential level by collecting data (energy, weather, electricity prices) and controlling home appliances or storage systems. This control can be performed with classical models that find optimal solutions, with high real-time computational cost, or data-driven approaches, like Reinforcement Learning, that find good and flexible solutions, but depend on the availability of load and generation data and demand high computational resources for training. In this work, a novel HEMS is proposed for the optimization of an electric battery operation in a real, online and data-driven environment that integrates state-of-the-art load forecasting combining CNN and LSTM neural networks to increase the robustness of decisions. Several Reinforcement Learning agents are trained with different algorithms (Double DQN, Dueling DQN, Rainbow and Proximal Policy Optimization) in order to minimize the cost of electricity purchase and to maximize photovoltaic self-consumption for a PV-Battery residential system. Results show that the best Reinforcement Learning agent achieves a 35% reduction in total cost when compared with an optimization-based agent.</p></div>","PeriodicalId":34138,"journal":{"name":"Energy and AI","volume":"16 ","pages":"Article 100347"},"PeriodicalIF":9.6000,"publicationDate":"2024-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S2666546824000132/pdfft?md5=801e90a3cad6681c711e85effe347670&pid=1-s2.0-S2666546824000132-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Optimization of a photovoltaic-battery system using deep reinforcement learning and load forecasting\",\"authors\":\"António Corte Real ,&nbsp;G. Pontes Luz ,&nbsp;J.M.C. Sousa ,&nbsp;M.C. Brito ,&nbsp;S.M. Vieira\",\"doi\":\"10.1016/j.egyai.2024.100347\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Home Energy Management Systems (HEMS) are increasingly relevant for demand-side management at the residential level by collecting data (energy, weather, electricity prices) and controlling home appliances or storage systems. This control can be performed with classical models that find optimal solutions, with high real-time computational cost, or data-driven approaches, like Reinforcement Learning, that find good and flexible solutions, but depend on the availability of load and generation data and demand high computational resources for training. In this work, a novel HEMS is proposed for the optimization of an electric battery operation in a real, online and data-driven environment that integrates state-of-the-art load forecasting combining CNN and LSTM neural networks to increase the robustness of decisions. Several Reinforcement Learning agents are trained with different algorithms (Double DQN, Dueling DQN, Rainbow and Proximal Policy Optimization) in order to minimize the cost of electricity purchase and to maximize photovoltaic self-consumption for a PV-Battery residential system. 
Results show that the best Reinforcement Learning agent achieves a 35% reduction in total cost when compared with an optimization-based agent.</p></div>\",\"PeriodicalId\":34138,\"journal\":{\"name\":\"Energy and AI\",\"volume\":\"16 \",\"pages\":\"Article 100347\"},\"PeriodicalIF\":9.6000,\"publicationDate\":\"2024-02-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S2666546824000132/pdfft?md5=801e90a3cad6681c711e85effe347670&pid=1-s2.0-S2666546824000132-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Energy and AI\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2666546824000132\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Energy and AI","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2666546824000132","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Home Energy Management Systems (HEMS) are increasingly relevant for demand-side management at the residential level by collecting data (energy, weather, electricity prices) and controlling home appliances or storage systems. This control can be performed with classical models that find optimal solutions, with high real-time computational cost, or data-driven approaches, like Reinforcement Learning, that find good and flexible solutions, but depend on the availability of load and generation data and demand high computational resources for training. In this work, a novel HEMS is proposed for the optimization of an electric battery operation in a real, online and data-driven environment that integrates state-of-the-art load forecasting combining CNN and LSTM neural networks to increase the robustness of decisions. Several Reinforcement Learning agents are trained with different algorithms (Double DQN, Dueling DQN, Rainbow and Proximal Policy Optimization) in order to minimize the cost of electricity purchase and to maximize photovoltaic self-consumption for a PV-Battery residential system. Results show that the best Reinforcement Learning agent achieves a 35% reduction in total cost when compared with an optimization-based agent.
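For intuition, the sketch below shows one way convolutional and LSTM layers can be combined for short-term load forecasting, in the spirit of the forecaster described in the abstract. The window length, layer sizes, feature set and training call are illustrative assumptions, not the authors' architecture.

```python
# Minimal sketch of a CNN-LSTM load forecaster (illustrative, not the paper's model).
# Input: sliding windows of past load; output: the next-step load value.
import numpy as np
from tensorflow.keras import layers, models

WINDOW = 96      # assumed: 24 h of 15-min load samples
N_FEATURES = 1   # load only; weather or price channels could be appended

model = models.Sequential([
    # 1-D convolution extracts local consumption patterns within the window
    layers.Conv1D(32, kernel_size=3, activation="relu",
                  input_shape=(WINDOW, N_FEATURES)),
    layers.MaxPooling1D(pool_size=2),
    # LSTM captures longer-range temporal dependencies across the window
    layers.LSTM(64),
    layers.Dense(32, activation="relu"),
    layers.Dense(1),  # forecast of the next load value
])
model.compile(optimizer="adam", loss="mse")

# Toy training call on random data, only to illustrate the expected tensor shapes
X = np.random.rand(256, WINDOW, N_FEATURES).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
```

The convolutional front end acts as a learned feature extractor over the recent load window, and the LSTM summarizes the resulting sequence before the dense head produces the point forecast.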


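Similarly, the toy environment below sketches the battery-dispatch decision that the Reinforcement Learning agents face: the action charges or discharges the battery, and the reward penalizes energy imported from the grid, which is equivalent to rewarding photovoltaic self-consumption. The battery model is deliberately simplified and all parameter values are assumptions for illustration, not the paper's setup.

```python
# Toy battery-dispatch environment (simplified sketch with assumed parameters).
from dataclasses import dataclass

@dataclass
class BatteryEnv:
    capacity_kwh: float = 10.0   # assumed usable battery capacity
    max_power_kw: float = 2.5    # assumed charge/discharge power limit
    efficiency: float = 0.95     # assumed charging efficiency
    buy_price: float = 0.20      # assumed purchase tariff, EUR/kWh
    dt_h: float = 1.0            # time-step length in hours
    soc_kwh: float = 5.0         # current state of charge

    def step(self, action_kw: float, load_kw: float, pv_kw: float):
        """action_kw > 0 charges the battery, < 0 discharges it."""
        power = max(-self.max_power_kw, min(self.max_power_kw, action_kw))
        requested_kwh = power * self.dt_h
        new_soc = max(0.0, min(self.capacity_kwh, self.soc_kwh + requested_kwh))
        stored_kwh = new_soc - self.soc_kwh  # energy actually stored (+) or released (-)
        self.soc_kwh = new_soc
        # Charging losses: storing 1 kWh requires 1/efficiency kWh of input energy
        battery_flow_kwh = stored_kwh / self.efficiency if stored_kwh > 0 else stored_kwh
        # Net energy drawn from the grid after PV generation and the battery action
        grid_import_kwh = max(0.0, load_kw * self.dt_h + battery_flow_kwh - pv_kw * self.dt_h)
        # Reward: minimise purchase cost, i.e. maximise PV self-consumption
        reward = -self.buy_price * grid_import_kwh
        return self.soc_kwh, reward

env = BatteryEnv()
soc, r = env.step(action_kw=-1.0, load_kw=2.0, pv_kw=0.5)  # discharge 1 kW for one hour
print(f"SoC = {soc:.2f} kWh, reward = {r:.3f}")
```

An agent trained with Double DQN, Dueling DQN, Rainbow or PPO would observe the state of charge together with forecasted load, PV generation and price signals, and choose the dispatch action that maximizes the cumulative reward.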

Source journal: Energy and AI (Engineering, miscellaneous)
CiteScore: 16.50
Self-citation rate: 0.00%
Articles published: 64
Review time: 56 days