{"title":"Inherently Interpretable Physics-Informed Neural Network for Battery Modeling and Prognosis.","authors":"Fujin Wang, Quanquan Zhi, Zhibin Zhao, Zhi Zhai, Yingkai Liu, Huan Xi, Shibin Wang, Xuefeng Chen","doi":"10.1109/TNNLS.2023.3329368","DOIUrl":null,"url":null,"abstract":"<p><p>Lithium-ion batteries are widely used in modern society. Accurate modeling and prognosis are fundamental to achieving reliable operation of lithium-ion batteries. Accurately predicting the end-of-discharge (EOD) is critical for operations and decision-making when they are deployed to critical missions. Existing data-driven methods have large model parameters, which require a large amount of labeled data and the models are not interpretable. Model-based methods need to know many parameters related to battery design, and the models are difficult to solve. To bridge these gaps, this study proposes a physics-informed neural network (PINN), called battery neural network (BattNN), for battery modeling and prognosis. Specifically, we propose to design the structure of BattNN based on the equivalent circuit model (ECM). Therefore, the entire BattNN is completely constrained by physics. Its forward propagation process follows the physical laws, and the model is inherently interpretable. To validate the proposed method, we conduct the discharge experiments under random loading profiles and develop our dataset. Analysis and experiments show that the proposed BattNN only needs approximately 30 samples for training, and the average required training time is 21.5 s. Experimental results on three datasets show that our method can achieve high prediction accuracy with only a few learnable parameters. Compared with other neural networks, the prediction MAEs of our BattNN are reduced by 77.1%, 67.4%, and 75.0% on three datasets, respectively. 
Our data and code will be available at: https://github.com/wang-fujin/BattNN.</p>","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"PP ","pages":""},"PeriodicalIF":10.2000,"publicationDate":"2023-11-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/TNNLS.2023.3329368","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Lithium-ion batteries are widely used in modern society, and accurate modeling and prognosis are fundamental to their reliable operation. Accurately predicting the end of discharge (EOD) is critical for operation and decision-making when batteries are deployed in critical missions. Existing data-driven methods have many model parameters, require large amounts of labeled data, and are not interpretable. Model-based methods require knowledge of many battery-design parameters, and the resulting models are difficult to solve. To bridge these gaps, this study proposes a physics-informed neural network (PINN), called the battery neural network (BattNN), for battery modeling and prognosis. Specifically, we design the structure of BattNN based on the equivalent circuit model (ECM), so the entire network is constrained by physics: its forward propagation follows physical laws, and the model is inherently interpretable. To validate the proposed method, we conduct discharge experiments under random loading profiles and build our own dataset. Analysis and experiments show that BattNN needs only approximately 30 training samples, with an average training time of 21.5 s. Experimental results on three datasets show that our method achieves high prediction accuracy with only a few learnable parameters. Compared with other neural networks, the prediction MAEs of BattNN are reduced by 77.1%, 67.4%, and 75.0% on the three datasets, respectively. Our data and code will be available at: https://github.com/wang-fujin/BattNN.
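The abstract states that BattNN's structure follows the equivalent circuit model, so the forward pass obeys physical laws. As background, a minimal sketch of the first-order Thevenin ECM (the standard form of such a circuit model) is shown below; the BattNN paper would learn the circuit parameters, whereas here the parameter values (q_ah, r0, r1, c1) and the linear open-circuit-voltage curve are purely illustrative assumptions, not values from the paper.

```python
import math

def simulate_ecm(current, dt=1.0, q_ah=2.0, r0=0.05, r1=0.02, c1=1000.0, soc0=1.0):
    """Simulate terminal voltage of a first-order Thevenin ECM.

    current: sequence of currents in amperes (positive = discharge).
    Returns (soc_trace, voltage_trace), one entry per time step.
    """
    def ocv(soc):
        # Illustrative linear open-circuit-voltage curve (assumption).
        return 3.0 + 1.2 * soc

    alpha = math.exp(-dt / (r1 * c1))  # RC-branch relaxation factor per step
    soc, v_rc = soc0, 0.0
    socs, volts = [], []
    for i in current:
        v_t = ocv(soc) - i * r0 - v_rc  # terminal voltage: OCV minus ohmic and RC drops
        socs.append(soc)
        volts.append(v_t)
        v_rc = alpha * v_rc + r1 * (1.0 - alpha) * i  # RC polarization update
        soc -= dt * i / (3600.0 * q_ah)               # coulomb counting
    return socs, volts
```

In a PINN of the kind the abstract describes, each of these update equations becomes a network layer whose few learnable parameters (R0, R1, C1, the OCV curve) retain their physical meaning, which is what makes the model inherently interpretable.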
Journal description:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.