An application of gradient-like dynamics to neural networks
J. W. Howse, C. Abdallah, G. Heileman, M. Georgiopoulos
Conference Record Southcon, 29 March 1994. DOI: 10.1109/SOUTHC.1994.498081
Citations: 0
Abstract
This paper reviews a formalism that enables the dynamics of a broad class of neural networks to be understood. This formalism is then applied to a specific network, and the predicted and simulated behaviors of the system are compared. The purpose of this work is to utilise a model of the dynamics that also describes the phase space behavior and structural stability of the system. This is achieved by writing the general equations of the neural network dynamics as a gradient-like system. In this paper it is demonstrated that a network with additive activation dynamics and Hebbian weight update dynamics can be expressed as a gradient-like system. An example of an S-layer network with feedback between adjacent layers is presented. It is shown that the process of weight learning is stable in this network when the learned weights are symmetric. Furthermore, the weight learning process is stable when the learned weights are asymmetric, provided that the activation is computed using only the symmetric part of the weights.
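For orientation, the following is a minimal sketch of the standard additive activation dynamics, Hebbian weight dynamics, and the gradient-like form that the abstract refers to. The notation ($x_i$, $w_{ij}$, $f_i$, $a_i$, $I_i$, $P$, $V$) and the specific equations are assumptions drawn from the common formulation, not necessarily those used in the paper.

Additive activation dynamics and Hebbian weight dynamics:
$$\dot{x}_i = -a_i x_i + \sum_{j} w_{ij}\, f_j(x_j) + I_i, \qquad \dot{w}_{ij} = -c\, w_{ij} + f_i(x_i)\, f_j(x_j).$$

A gradient-like system is one that can be written as
$$\dot{z} = -P(z)\, \nabla V(z), \qquad P(z) \succ 0,$$
so that $\dot{V} = -\nabla V(z)^{\top} P(z)\, \nabla V(z) \le 0$ and $V$ serves as a Lyapunov function whose isolated minima are the stable equilibria. For symmetric weights ($w_{ij} = w_{ji}$) and strictly increasing activation functions $f_i$, the activation dynamics fit this form with the classical energy
$$V(x) = -\tfrac{1}{2}\sum_{i,j} w_{ij}\, f_i(x_i)\, f_j(x_j) + \sum_i a_i \int_0^{x_i} \xi\, f_i'(\xi)\, d\xi - \sum_i I_i\, f_i(x_i),$$
since $\partial V / \partial x_i = f_i'(x_i)\big[a_i x_i - \sum_j w_{ij} f_j(x_j) - I_i\big]$, giving $\dot{x} = -P(x)\,\nabla V(x)$ with $P(x) = \operatorname{diag}\!\big(1/f_i'(x_i)\big)$ positive definite. This is consistent with the abstract's condition that, when the learned weights are asymmetric, only their symmetric part should be used in computing the activation.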