Pengkai Wang, Song Chen, Jiaxu Liu, Shengze Cai, Chao Xu
{"title":"PIDNODEs:受比例-积分-派生控制器启发的神经常微分方程","authors":"Pengkai Wang , Song Chen , Jiaxu Liu , Shengze Cai , Chao Xu","doi":"10.1016/j.neucom.2024.128769","DOIUrl":null,"url":null,"abstract":"<div><div>Neural Ordinary Differential Equations (NODEs) are a novel family of infinite-depth neural-net models through solving ODEs and their adjoint equations. In this paper, we present a strategy to enhance the training and inference of NODEs by integrating a Proportional–Integral–Derivative (PID) controller into the framework of Heavy Ball NODE, resulting in the proposed PIDNODEs and its generalized version, GPIDNODEs. By leveraging the advantages of control, PIDNODEs and GPIDNODEs can address the stiff ODE challenges by adjusting the parameters (i.e., <span><math><msub><mrow><mi>K</mi></mrow><mrow><mi>p</mi></mrow></msub></math></span>, <span><math><msub><mrow><mi>K</mi></mrow><mrow><mi>i</mi></mrow></msub></math></span> and <span><math><msub><mrow><mi>K</mi></mrow><mrow><mi>d</mi></mrow></msub></math></span>) in the PID module. The experiments confirm the superiority of PIDNODEs/GPIDNODEs over other NODE baselines on different computer vision and pattern recognition tasks, including image classification, point cloud separation and learning long-term dependencies from irregular time-series data for a physical dynamic system. These experiments demonstrate that the proposed models have higher accuracy and fewer function evaluations while alleviating the dilemma of exploding and vanishing gradients, particularly when learning long-term dependencies from a large amount of data.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"614 ","pages":"Article 128769"},"PeriodicalIF":5.5000,"publicationDate":"2024-10-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"PIDNODEs: Neural ordinary differential equations inspired by a proportional–integral–derivative controller\",\"authors\":\"Pengkai Wang , Song Chen , Jiaxu Liu , Shengze Cai , Chao Xu\",\"doi\":\"10.1016/j.neucom.2024.128769\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Neural Ordinary Differential Equations (NODEs) are a novel family of infinite-depth neural-net models through solving ODEs and their adjoint equations. In this paper, we present a strategy to enhance the training and inference of NODEs by integrating a Proportional–Integral–Derivative (PID) controller into the framework of Heavy Ball NODE, resulting in the proposed PIDNODEs and its generalized version, GPIDNODEs. By leveraging the advantages of control, PIDNODEs and GPIDNODEs can address the stiff ODE challenges by adjusting the parameters (i.e., <span><math><msub><mrow><mi>K</mi></mrow><mrow><mi>p</mi></mrow></msub></math></span>, <span><math><msub><mrow><mi>K</mi></mrow><mrow><mi>i</mi></mrow></msub></math></span> and <span><math><msub><mrow><mi>K</mi></mrow><mrow><mi>d</mi></mrow></msub></math></span>) in the PID module. The experiments confirm the superiority of PIDNODEs/GPIDNODEs over other NODE baselines on different computer vision and pattern recognition tasks, including image classification, point cloud separation and learning long-term dependencies from irregular time-series data for a physical dynamic system. 
These experiments demonstrate that the proposed models have higher accuracy and fewer function evaluations while alleviating the dilemma of exploding and vanishing gradients, particularly when learning long-term dependencies from a large amount of data.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"614 \",\"pages\":\"Article 128769\"},\"PeriodicalIF\":5.5000,\"publicationDate\":\"2024-10-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231224015406\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231224015406","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
PIDNODEs: Neural ordinary differential equations inspired by a proportional–integral–derivative controller
Neural Ordinary Differential Equations (NODEs) are a novel family of infinite-depth neural network models defined by solving ODEs and their adjoint equations. In this paper, we present a strategy to enhance the training and inference of NODEs by integrating a Proportional–Integral–Derivative (PID) controller into the Heavy Ball NODE framework, resulting in the proposed PIDNODEs and their generalized version, GPIDNODEs. By leveraging the advantages of feedback control, PIDNODEs and GPIDNODEs can address stiff-ODE challenges by adjusting the parameters (i.e., K_p, K_i and K_d) in the PID module. The experiments confirm the superiority of PIDNODEs/GPIDNODEs over other NODE baselines on different computer vision and pattern recognition tasks, including image classification, point cloud separation and learning long-term dependencies from irregular time-series data of a physical dynamical system. These experiments demonstrate that the proposed models achieve higher accuracy with fewer function evaluations while alleviating the dilemma of exploding and vanishing gradients, particularly when learning long-term dependencies from a large amount of data.
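To make the PID analogy concrete, the sketch below illustrates one way PID-style feedback terms, weighted by gains K_p, K_i and K_d, could augment a neural ODE's vector field. It is a minimal, hypothetical illustration assuming a torchdiffeq-style solver and an augmented state split into position-like, velocity-like and integral-like components; the class name PIDStyleODEFunc, the gain placement and the state layout are illustrative assumptions, not the authors' exact PIDNODE/GPIDNODE formulation, which is given in the paper.

    # Hypothetical sketch (not the paper's exact model): a neural ODE whose dynamics
    # include proportional, integral and derivative feedback channels.
    import torch
    import torch.nn as nn
    from torchdiffeq import odeint  # assumed solver backend

    class PIDStyleODEFunc(nn.Module):
        def __init__(self, dim, k_p=1.0, k_i=0.1, k_d=0.01):
            super().__init__()
            # Learned vector field acting on the position-like state h.
            self.net = nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))
            self.k_p, self.k_i, self.k_d = k_p, k_i, k_d

        def forward(self, t, state):
            # state = (h, v, s): position-like, velocity-like, integral-like components.
            h, v, s = state
            f = self.net(h)                                  # learned field evaluated at h
            dh = v                                           # h is driven by the velocity channel
            dv = self.k_p * f - self.k_d * v + self.k_i * s  # PID-weighted update of v
            ds = f                                           # accumulate the integral channel
            return dh, dv, ds

    # Usage: integrate the augmented state over a time grid.
    func = PIDStyleODEFunc(dim=2)
    h0 = torch.zeros(1, 2); v0 = torch.zeros(1, 2); s0 = torch.zeros(1, 2)
    t = torch.linspace(0.0, 1.0, 10)
    h_t, v_t, s_t = odeint(func, (h0, v0, s0), t)

In this reading, the gains play the same role as in a classical PID controller: K_p scales the immediate correction, K_i the accumulated error, and K_d damps fast changes, which is how tuning them can help with stiffness.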
Journal description:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics covered.