Robust Reinforcement Learning for Decision Making Under Uncertainty in Electricity Markets
Authors: Dawei Qiu; Jianhong Wang; Guangchun Ruan; Qianzhi Zhang; Goran Strbac
DOI: 10.1109/TPWRS.2024.3502639
Journal: IEEE Transactions on Power Systems, vol. 40, no. 3, pp. 2750-2763
Publication date: 2024-11-20 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10759306/
JCR: Q1 (Engineering, Electrical & Electronic); Impact Factor: 7.2
Citations: 0
Abstract
Reinforcement learning (RL) is a powerful tool for market agents solving decision-making problems in electricity markets. Vanilla RL enables agents to learn optimal policies in dynamic and uncertain market environments via trial and error. However, uncertainties in state transitions are often treated as exogenous state features with statistical errors. This approach can result in policies that are sensitive to perturbations of these uncertainties, potentially leading to performance degradation. This sensitivity is particularly critical in electricity markets, where renewable energy penetration and demand variability are both increasing. To address this issue, this paper proposes a robust adversarial RL algorithm that learns a robust optimal policy accounting for market uncertainties in state transitions, thereby systematically mitigating sensitivity to perturbations in uncertain environments. Specifically, we leverage the uncertainty set regularizer technique to define uncertainty sets within the parametric space of state transitions. Furthermore, we introduce a novel adversarial approach that generates unknown uncertainty sets using the value function as a basis. Finally, we conduct a comprehensive assessment of the robust adversarial RL algorithm across three electricity market applications: strategic bidding, retail pricing, and peer-to-peer energy trading, demonstrating significant improvements in robustness against various uncertainties.
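The core idea of robust RL over transition-uncertainty sets can be illustrated with a minimal tabular sketch. This is not the paper's algorithm (which works in the parametric space of state transitions with an adversarially generated uncertainty set); it is a toy robust value iteration on a made-up MDP, where an adversary perturbs the nominal transition distribution within a simple budget-eps uncertainty set to minimize the agent's value. All sizes, rewards, and the set geometry below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: robust value iteration on a toy random MDP.
# The nominal model P and rewards R are hypothetical; eps is the radius
# of a simple transition-uncertainty set (mass-shifting budget).
n_states, n_actions = 3, 2
rng = np.random.default_rng(0)
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))  # P[a, s, s']
R = rng.uniform(0.0, 1.0, size=(n_actions, n_states))             # R[a, s]
gamma, eps = 0.9, 0.1

def worst_case_transition(p, V, eps):
    """Adversary's inner minimization: move up to eps of probability mass
    from the most valuable next state to the least valuable one, keeping
    a valid distribution. A crude stand-in for optimizing over a
    transition-uncertainty set."""
    q = p.copy()
    lo, hi = np.argmin(V), np.argmax(V)
    shift = min(eps, q[hi])  # cannot remove more mass than exists
    q[hi] -= shift
    q[lo] += shift
    return q

V = np.zeros(n_states)
for _ in range(500):  # robust Bellman iteration (a gamma-contraction)
    Q = np.empty((n_actions, n_states))
    for a in range(n_actions):
        for s in range(n_states):
            q = worst_case_transition(P[a, s], V, eps)
            Q[a, s] = R[a, s] + gamma * (q @ V)
    V_new = Q.max(axis=0)  # agent maximizes against the worst-case model
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

robust_policy = Q.argmax(axis=0)
print("robust value function:", np.round(V, 3))
```

The resulting policy hedges against the worst transition model inside the set, which is what makes it less sensitive to perturbations than a policy trained on the nominal model alone.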
Journal Introduction:
The scope of IEEE Transactions on Power Systems covers the education, analysis, operation, planning, and economics of electric generation, transmission, and distribution systems for general industrial, commercial, public, and domestic consumption, including interaction with multi-energy carriers. The focus of the Transactions is the power system from a systems viewpoint rather than on individual components. Its scope comprises five key areas, each with several technical topics: (1) Power Engineering Education, (2) Power System Analysis, Computing, and Economics, (3) Power System Dynamic Performance, (4) Power System Operations, and (5) Power System Planning and Implementation.