A reinforcement learning method for two-layer shipboard real-time energy management considering battery state estimation

Huayue Zhang, Shuli Wen, Mingchang Gu, Miao Zhu, Huili Ye

IET Energy Systems Integration, vol. 6, no. 3, pp. 333–343, 2 July 2024. DOI: 10.1049/esi2.12157
Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1049/esi2.12157
Increasing global environmental concern is driving a continuous reduction in carbon emissions from the shipping industry, and replacing traditional fossil fuels with advanced energy storage technology has become an irreversible trend. However, improper energy management leads not only to energy waste but also to undesired costs and emissions. Accordingly, the authors develop a two-layer shipboard energy management framework. In the first stage, a shipboard navigation planning problem that considers battery state estimation is formulated and solved using particle swarm optimisation to obtain an optimal speed trajectory. In the second stage, to track the scheduled speed, a reinforcement learning method based on a deep Q-network is proposed to realise real-time energy management of the diesel generator and energy storage system. This approach keeps the state of charge within a safe range, avoids excessive discharge of the energy storage system, and improves overall efficiency. The numerical results demonstrate the necessity and effectiveness of the proposed method.
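To illustrate the first stage described in the abstract, the following is a minimal sketch of particle swarm optimisation searching for a speed trajectory. The cost model (propulsion power proportional to speed cubed), the distance constraint, and all constants are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Hypothetical stand-in for the navigation planning stage: PSO searches for a
# speed profile over T intervals that minimises a cubic propulsion-energy cost
# while covering a fixed total distance (enforced via a quadratic penalty).
rng = np.random.default_rng(1)

T = 6                  # number of navigation intervals (assumed)
DIST = 60.0            # total distance to cover (arbitrary units)
V_MIN, V_MAX = 5.0, 20.0

def cost(v):
    """Cubic propulsion cost plus a penalty for missing the distance target."""
    return np.sum(v ** 3) + 1e3 * (np.sum(v) - DIST) ** 2

def pso(n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(V_MIN, V_MAX, (n_particles, T))   # particle positions
    vel = np.zeros_like(x)
    pbest = x.copy()                                  # per-particle best
    pbest_cost = np.array([cost(p) for p in x])
    g = pbest[np.argmin(pbest_cost)].copy()           # global best
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, T))
        vel = w * vel + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + vel, V_MIN, V_MAX)
        c = np.array([cost(p) for p in x])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        g = pbest[np.argmin(pbest_cost)].copy()
    return g

speed = pso()
print(speed.round(2))  # the cubic cost favours a near-uniform speed profile
```

Because the cost is convex in each interval's speed, the optimiser tends toward a near-constant trajectory; the paper's formulation presumably adds battery-state and operational constraints on top of such a planning core.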
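For the second stage, the sketch below uses a simplified tabular Q-learning agent as a stand-in for the paper's deep Q-network: at each interval it splits a power demand between the diesel generator and the energy storage system (ESS) while penalising departures from a safe state-of-charge (SoC) band. The discretisation, demand profile, reward weights, and safe band are all assumptions for illustration.

```python
import numpy as np

# Hypothetical dispatch problem: meet a power demand each step by drawing from
# the ESS or the diesel generator; fuel use and leaving the safe SoC band are
# penalised. A tabular Q-table replaces the paper's deep Q-network.
rng = np.random.default_rng(0)

SOC_LEVELS = 11                        # SoC discretised as 0.0, 0.1, ..., 1.0
ACTIONS = np.array([-0.1, 0.0, 0.1])   # ESS charge(-) / idle / discharge(+)
SOC_MIN, SOC_MAX = 0.2, 0.9            # assumed safe SoC band

def step(soc, action_idx, demand):
    """Apply one dispatch decision; return next SoC and reward."""
    ess_power = ACTIONS[action_idx]           # power drawn from the ESS
    next_soc = np.clip(soc - ess_power, 0.0, 1.0)
    diesel = max(demand - ess_power, 0.0)     # diesel covers the remainder
    reward = -diesel                          # penalise fuel use
    if not (SOC_MIN <= next_soc <= SOC_MAX):  # penalise unsafe SoC
        reward -= 1.0
    return next_soc, reward

def train(episodes=500, alpha=0.3, gamma=0.95, eps=0.2):
    q = np.zeros((SOC_LEVELS, len(ACTIONS)))
    for _ in range(episodes):
        soc = 0.5
        for _ in range(24):                    # 24 dispatch intervals
            demand = rng.uniform(0.05, 0.2)    # illustrative demand profile
            s = int(round(soc * 10))
            a = rng.integers(len(ACTIONS)) if rng.random() < eps \
                else int(np.argmax(q[s]))      # epsilon-greedy exploration
            soc, r = step(soc, a, demand)
            s2 = int(round(soc * 10))
            q[s, a] += alpha * (r + gamma * np.max(q[s2]) - q[s, a])
    return q

q = train()
policy = ACTIONS[np.argmax(q, axis=1)]         # greedy dispatch per SoC level
print(policy)
```

The safe-band penalty in the reward is what discourages excessive discharge near the lower SoC limit; the paper's deep Q-network plays the same role over a continuous state space with the scheduled speed as part of the input.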