{"title":"反向传播作为人工神经网络训练中微分代数方程的解","authors":"J. Sanchez-Gasca, D. Klapper, J. Yoshizawa","doi":"10.1109/ANN.1991.213470","DOIUrl":null,"url":null,"abstract":"The backpropagation algorithm for neural network training is formulated as the solution of a set of sparse differential algebraic equations (DAE). These equations are then solved as a function of time. The solution of the differential equations is performed using an implicit integrator with adjustable time step. The topology of the Jacobian matrix associated with the DAE's is illustrated. A training example is included.<<ETX>>","PeriodicalId":119713,"journal":{"name":"Proceedings of the First International Forum on Applications of Neural Networks to Power Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1991-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Back-propagation as the solution of differential-algebraic equations for artificial neural network training\",\"authors\":\"J. Sanchez-Gasca, D. Klapper, J. Yoshizawa\",\"doi\":\"10.1109/ANN.1991.213470\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The backpropagation algorithm for neural network training is formulated as the solution of a set of sparse differential algebraic equations (DAE). These equations are then solved as a function of time. The solution of the differential equations is performed using an implicit integrator with adjustable time step. The topology of the Jacobian matrix associated with the DAE's is illustrated. A training example is included.<<ETX>>\",\"PeriodicalId\":119713,\"journal\":{\"name\":\"Proceedings of the First International Forum on Applications of Neural Networks to Power Systems\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1991-07-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the First International Forum on Applications of Neural Networks to Power Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ANN.1991.213470\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the First International Forum on Applications of Neural Networks to Power Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ANN.1991.213470","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Back-propagation as the solution of differential-algebraic equations for artificial neural network training
The back-propagation algorithm for neural network training is formulated as the solution of a set of sparse differential-algebraic equations (DAEs), which are then solved as a function of time. The integration is performed with an implicit integrator using an adjustable time step. The topology of the Jacobian matrix associated with the DAEs is illustrated, and a training example is included.
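To make the idea concrete, below is a minimal, hypothetical sketch of this style of training, not the authors' implementation. One plausible reading of the abstract is that the weights follow the gradient-flow equation dw/dt = -dE/dw while the forward-pass activations enter as algebraic constraints; the paper assembles the full sparse DAE system and solves it with an implicit integrator using a structured Jacobian. The sketch below simplifies both parts: it integrates only the gradient-flow equation with a backward-Euler step, resolves the implicit step by fixed-point iteration rather than a Newton solve on the DAE Jacobian, adapts the step size by growing it on success and halving it on failure, and uses XOR as stand-in training data. All names and numerical choices here are illustrative assumptions.

```python
# A minimal sketch (not the paper's implementation): gradient-flow
# training dw/dt = -dE/dw integrated with implicit (backward) Euler
# and a simple adjustable time step. The paper solves the full sparse
# DAE system with an implicit integrator and a structured Jacobian;
# here the implicit step is resolved by fixed-point iteration and the
# network, data, and step-control constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# XOR training set (stand-in example data, not from the paper).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([[0.], [1.], [1.], [0.]])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(w):
    # 2-2-1 network: W1 (2x2), b1 (2,), W2 (2x1), b2 (1,)
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def grad(w):
    """Back-propagation gradient of the sum-of-squares error."""
    W1, b1, W2, b2 = unpack(w)
    H = sigmoid(X @ W1 + b1)          # hidden activations (the network's
                                      # "algebraic" equations)
    Y = sigmoid(H @ W2 + b2)          # output activations
    dY = (Y - T) * Y * (1.0 - Y)      # output-layer delta
    dH = (dY @ W2.T) * H * (1.0 - H)  # hidden-layer delta
    g = np.concatenate([(X.T @ dH).ravel(), dH.sum(0),
                        (H.T @ dY).ravel(), dY.sum(0)])
    return g, float(0.5 * np.sum((Y - T) ** 2))

def implicit_euler_step(w, h, iters=50, tol=1e-10):
    """Solve w_next = w - h * grad(w_next) by fixed-point iteration."""
    w_next = w.copy()
    for _ in range(iters):
        g, _ = grad(w_next)
        w_new = w - h * g
        if np.linalg.norm(w_new - w_next) < tol:
            return w_new, True
        w_next = w_new
    return w_next, False

# Integrate the gradient flow in "time" with an adjustable step:
# grow h after a converged step, halve it and retry otherwise.
w = rng.normal(scale=1.0, size=9)
h, t = 0.5, 0.0
while t < 200.0:
    w_new, converged = implicit_euler_step(w, h)
    if converged:
        w, t, h = w_new, t + h, min(h * 1.2, 10.0)
    else:
        h *= 0.5
_, E = grad(w)
print(f"final error {E:.6f} at t={t:.1f}, step h={h:.3f}")
```

The design point the sketch tries to show is the one the abstract emphasizes: an implicit method lets the "time" step grow large as training approaches a stationary point, where an explicit step of the same size would be unstable, at the cost of an inner solve per step (fixed-point iteration here, a sparse Newton/Jacobian solve in the paper's formulation).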