{"title":"基于n步自举的微电网储能管理","authors":"Necati Aksoy;Istemihan Genc","doi":"10.1109/ICJECE.2022.3232213","DOIUrl":null,"url":null,"abstract":"Microgrids offer superiorities such as reducing energy costs and increasing the quality of energy, with the use of renewable energy sources and the effective use of energy storage unit created with innovative batteries. Furthermore, this structure, which helps to reduce the carbon footprint, will become undeniably critical to use in near future with the nanogrid and smart grid. As another development, an artificial intelligence (AI)-based control infrastructure brought to us by machine learning stands out as more beneficial than classical control methods. With this framework, which is called reinforcement learning (RL), it is promised that the system to be controlled can be more efficient. At this point, the thrifty use of energy storage unit, which is the most important tool that will increase the profitability of microgrids and enhance the proficiency of energy use, is associated with an RL-based energy control system. While this study focuses on an AI-based control infrastructure, it proposes a method utilizing an RL agent trained with a novel environmental model proposed specifically for the energy storage unit of microgrids. The advantages of this method demonstrated with the results are obtained, are shown and examined.","PeriodicalId":100619,"journal":{"name":"IEEE Canadian Journal of Electrical and Computer Engineering","volume":"46 2","pages":"107-116"},"PeriodicalIF":2.1000,"publicationDate":"2023-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Energy Storage Management for Microgrids Using n-Step Bootstrapping\",\"authors\":\"Necati Aksoy;Istemihan Genc\",\"doi\":\"10.1109/ICJECE.2022.3232213\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Microgrids offer superiorities such as reducing energy costs and increasing the quality of energy, with the use of renewable energy sources and the effective use of energy storage unit created with innovative batteries. Furthermore, this structure, which helps to reduce the carbon footprint, will become undeniably critical to use in near future with the nanogrid and smart grid. As another development, an artificial intelligence (AI)-based control infrastructure brought to us by machine learning stands out as more beneficial than classical control methods. With this framework, which is called reinforcement learning (RL), it is promised that the system to be controlled can be more efficient. At this point, the thrifty use of energy storage unit, which is the most important tool that will increase the profitability of microgrids and enhance the proficiency of energy use, is associated with an RL-based energy control system. While this study focuses on an AI-based control infrastructure, it proposes a method utilizing an RL agent trained with a novel environmental model proposed specifically for the energy storage unit of microgrids. 
The advantages of this method demonstrated with the results are obtained, are shown and examined.\",\"PeriodicalId\":100619,\"journal\":{\"name\":\"IEEE Canadian Journal of Electrical and Computer Engineering\",\"volume\":\"46 2\",\"pages\":\"107-116\"},\"PeriodicalIF\":2.1000,\"publicationDate\":\"2023-03-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Canadian Journal of Electrical and Computer Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10126129/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Canadian Journal of Electrical and Computer Engineering","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10126129/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, HARDWARE & ARCHITECTURE","Score":null,"Total":0}
Energy Storage Management for Microgrids Using n-Step Bootstrapping
Microgrids reduce energy costs and improve power quality by combining renewable energy sources with energy storage units built on modern battery technologies. Because this structure also helps reduce the carbon footprint, it will become critical in the near future alongside nanogrids and smart grids. In parallel, artificial intelligence (AI)-based control infrastructures enabled by machine learning can outperform classical control methods. In particular, the reinforcement learning (RL) framework promises more efficient operation of the controlled system. Economical use of the energy storage unit, the key means of increasing the profitability of a microgrid and the effectiveness of its energy use, can therefore be entrusted to an RL-based energy control system. Building on such an AI-based control infrastructure, this study proposes a method in which an RL agent is trained on a novel environment model developed specifically for the energy storage unit of a microgrid. The advantages of the method are demonstrated and examined through the results obtained.
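For readers unfamiliar with the technique named in the title, the sketch below illustrates generic n-step bootstrapping (here, tabular n-step Sarsa) applied to a toy battery-scheduling task. It is a minimal sketch under assumed conditions: the BatteryEnv dynamics, the sinusoidal price signal, the reward, and the hyperparameters are illustrative assumptions and are not the environment model, agent, or results described by the authors.

```python
# Minimal sketch of n-step bootstrapping (tabular n-step Sarsa) on a toy
# battery-scheduling environment. The environment, its price curve and reward,
# and every name here (BatteryEnv, n_step_sarsa, ...) are illustrative
# assumptions, not the environment model or training setup used in the paper.
import numpy as np

class BatteryEnv:
    """Toy storage environment: state = (hour of day, discrete state of charge)."""
    def __init__(self, levels=5, horizon=24, seed=0):
        self.levels, self.horizon = levels, horizon
        self.rng = np.random.default_rng(seed)

    def reset(self):
        self.hour, self.soc = 0, self.levels // 2
        return (self.hour, self.soc)

    def step(self, action):
        # action: 0 = discharge one level, 1 = idle, 2 = charge one level
        price = 1.0 + np.sin(2 * np.pi * self.hour / self.horizon) + 0.1 * self.rng.normal()
        new_soc = int(np.clip(self.soc + (action - 1), 0, self.levels - 1))
        reward = -price * (new_soc - self.soc)   # pay to charge, earn by discharging
        self.soc, self.hour = new_soc, self.hour + 1
        done = self.hour >= self.horizon
        return (self.hour % self.horizon, self.soc), reward, done

def n_step_sarsa(env, n=4, episodes=2000, alpha=0.1, gamma=0.99, eps=0.1):
    """n-step Sarsa: update each Q(s, a) toward an n-step bootstrapped return."""
    Q = np.zeros((env.horizon, env.levels, 3))
    def policy(s):  # epsilon-greedy action selection
        return np.random.randint(3) if np.random.rand() < eps else int(np.argmax(Q[s]))
    for _ in range(episodes):
        states, rewards = [env.reset()], [0.0]
        actions = [policy(states[0])]
        T, t = np.inf, 0
        while True:
            if t < T:
                s_next, r, done = env.step(actions[t])
                states.append(s_next)
                rewards.append(r)
                if done:
                    T = t + 1
                else:
                    actions.append(policy(s_next))
            tau = t - n + 1          # time step whose estimate is updated
            if tau >= 0:
                # n-step return: n discounted real rewards plus a bootstrapped Q-value
                G = sum(gamma ** (i - tau - 1) * rewards[i]
                        for i in range(tau + 1, int(min(tau + n, T)) + 1))
                if tau + n < T:
                    G += gamma ** n * Q[states[tau + n]][actions[tau + n]]
                Q[states[tau]][actions[tau]] += alpha * (G - Q[states[tau]][actions[tau]])
            if tau == T - 1:
                break
            t += 1
    return Q

if __name__ == "__main__":
    Q = n_step_sarsa(BatteryEnv())
    print("Greedy action at hour 0, mid state of charge:", int(np.argmax(Q[0, 2])))
```

The defining feature of n-step bootstrapping is that each state-action value is updated from n observed rewards before bootstrapping on a later value estimate, trading off the bias of one-step temporal-difference updates against the variance of full Monte Carlo returns.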