{"title":"作为收缩映射的循环网络的收敛性","authors":"J. Steck","doi":"10.1109/IJCNN.1992.227131","DOIUrl":null,"url":null,"abstract":"Three theorems are presented which establish an upper bound on the magnitude of the weights which guarantees convergence of the network to a stable unique fixed point. It is shown that the bound on the weights is inversely proportional to the product of the number of neurons in the network and the maximum slope of the neuron activation functions. The location of its fixed point is determined by the network architecture, weights, and the external input values. The proofs are constructive, consisting of representing the network as a contraction mapping and then applying the contraction mapping theorem from point set topology. The resulting sufficient conditions for network stability are shown to be general enough to allow the network to have nontrivial fixed points.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"8 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"20","resultStr":"{\"title\":\"Convergence of recurrent networks as contraction mappings\",\"authors\":\"J. Steck\",\"doi\":\"10.1109/IJCNN.1992.227131\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Three theorems are presented which establish an upper bound on the magnitude of the weights which guarantees convergence of the network to a stable unique fixed point. It is shown that the bound on the weights is inversely proportional to the product of the number of neurons in the network and the maximum slope of the neuron activation functions. The location of its fixed point is determined by the network architecture, weights, and the external input values. The proofs are constructive, consisting of representing the network as a contraction mapping and then applying the contraction mapping theorem from point set topology. The resulting sufficient conditions for network stability are shown to be general enough to allow the network to have nontrivial fixed points.<<ETX>>\",\"PeriodicalId\":286849,\"journal\":{\"name\":\"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks\",\"volume\":\"8 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1992-06-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"20\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.1992.227131\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1992.227131","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Convergence of recurrent networks as contraction mappings
Three theorems are presented that establish an upper bound on the magnitude of the weights which guarantees convergence of the network to a unique, stable fixed point. It is shown that the bound on the weights is inversely proportional to the product of the number of neurons in the network and the maximum slope of the neuron activation functions. The location of the fixed point is determined by the network architecture, the weights, and the external input values. The proofs are constructive, consisting of representing the network as a contraction mapping and then applying the contraction mapping theorem from point-set topology. The resulting sufficient conditions for network stability are shown to be general enough to allow the network to have nontrivial fixed points.
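The paper itself gives no code; the short numerical sketch below (an illustration, not the paper's construction) shows the flavor of the result. If the largest weight magnitude is kept below roughly 1/(N·s_max), where N is the number of neurons and s_max is the maximum slope of the activation function (1/4 for the logistic sigmoid), the Jacobian of the update x ← σ(Wx + b) has infinity-norm below 1, so the map is a contraction and synchronous iteration converges to a unique fixed point from any start. The network size, the random weights, and the exact constant in the bound are illustrative assumptions; the paper's precise bound may differ in its constant.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    N = 8           # number of neurons (hypothetical size)
    s_max = 0.25    # maximum slope of the logistic sigmoid
    rng = np.random.default_rng(0)

    # Draw weights and rescale so max |w_ij| stays strictly below 1 / (N * s_max),
    # a bound of the form described in the abstract (exact constant assumed here).
    W = rng.uniform(-1.0, 1.0, size=(N, N))
    bound = 1.0 / (N * s_max)
    W *= 0.9 * bound / np.abs(W).max()

    b = rng.uniform(-0.5, 0.5, size=N)   # external inputs

    # Iterate x <- sigmoid(W x + b); under the bound the infinity-norm of the
    # Jacobian, s_max * max_i sum_j |w_ij|, is below 1, so the iteration is a
    # contraction and converges to the unique fixed point.
    x = rng.uniform(size=N)
    for _ in range(200):
        x_next = sigmoid(W @ x + b)
        if np.max(np.abs(x_next - x)) < 1e-12:
            break
        x = x_next

    print("fixed point:", x)
    print("residual:", np.max(np.abs(sigmoid(W @ x + b) - x)))

Running the sketch prints a residual at machine precision, consistent with convergence to a unique fixed point whose location depends on W and the external inputs b, as the abstract states.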