{"title":"Composing and Solving General Differential Equations Using Extended Polynomial Networks","authors":"L. Zjavka, V. Snás̃el","doi":"10.1109/INCoS.2015.28","DOIUrl":null,"url":null,"abstract":"Multi-variable data relations can define a partial differential equation, which describes an unknown complex function on a basis of discrete observations, using the similarity model analysis methods. Time-series can form an ordinary differential equation, which is analogously possible to replace by partial derivatives of the same type time-dependent observations. Polynomial neural networks can compose and solve an unknown general partial differential equation of a searched function or pattern model by means of low order composite multi-variable derivative fractions. Convergent sum series of relative terms, produced by polynomial networks, describe partial dependent derivative changes of some polynomial combinations of input variables and can substitute for the general differential equation. This non-linear regression type is based on learned generalized partial elementary data relations, decomposed into a polynomial network derivative structure, which is able to define and create more complex and varied indirect model forms than standard soft computing techniques allow. The sigmoidal function, commonly used as an activation function in artificial neurons, may improve the polynomials and substituting derivative term series abilities to approximate complicated periodic multi-variable or time-series functions and model a system behaviour.","PeriodicalId":345650,"journal":{"name":"2015 International Conference on Intelligent Networking and Collaborative Systems","volume":"24 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-09-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Conference on Intelligent Networking and Collaborative Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INCoS.2015.28","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Multi-variable data relations can define a partial differential equation that describes an unknown complex function on the basis of discrete observations, using similarity model analysis methods. A time series can likewise form an ordinary differential equation, which can analogously be replaced by partial derivatives of the same type of time-dependent observations. Polynomial neural networks can compose and solve an unknown general partial differential equation of a searched function or pattern model by means of low-order composite multi-variable derivative fractions. Convergent sum series of relative terms, produced by the polynomial networks, describe partial dependent derivative changes of selected polynomial combinations of the input variables and can substitute for the general differential equation. This type of non-linear regression is based on learned generalized partial elementary data relations, decomposed into a polynomial network derivative structure, and it can define and create more complex and varied indirect model forms than standard soft-computing techniques allow. The sigmoidal function, commonly used as an activation function of artificial neurons, may improve the ability of the polynomials and the substituting derivative term series to approximate complicated periodic multi-variable or time-series functions and to model system behaviour.
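As a brief illustration of the substitution idea, the general linear partial differential equation that the sum series of network terms stands in for can be written in the form below. The exact term composition is a hedged sketch following the differential polynomial network approach described in related work; the particular polynomial orders, exponents, and symbols (u_k, w_k, a_i, b_i, φ) are assumptions for illustration, not taken from this abstract.

\[
a + \sum_{i=1}^{n} b_i \frac{\partial u}{\partial x_i}
  + \sum_{i=1}^{n} \sum_{j=1}^{n} c_{ij} \frac{\partial^2 u}{\partial x_i \partial x_j} + \dots = 0,
\qquad
u(x_1, \dots, x_n) = \sum_{k} u_k ,
\]

where u is the searched function of n input variables and each u_k is a relative term: a fraction whose numerator is derived from a low-order polynomial combination of the inputs and whose denominator is a lower-order polynomial of the same inputs. For two inputs, a single substituting term might, for example, take a form such as

\[
u_k = w_k \, \varphi\!\left(
  \frac{\bigl(a_0 + a_1 x_1 + a_2 x_2 + a_3 x_1 x_2 + a_4 x_1^2 + a_5 x_2^2\bigr)^{1/2}}
       {b_0 + b_1 x_1 + b_2 x_2}
\right),
\]

with φ a sigmoidal activation and w_k, a_i, b_i adjustable parameters learned from the discrete observations.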