{"title":"时变激活函数的递归神经网络稳定性分析","authors":"M. Mostafa, W. Teich, J. Lindner","doi":"10.1109/INDS.2011.6024816","DOIUrl":null,"url":null,"abstract":"The dynamical behavior of a single layer recurrent neural network without hidden neurons has been investigated intensively and its stability has been analyzed using the Lyapunov method. Since the pioneering work of Hopfield many modified versions of the original Hopfield network have been suggested and their stability has been proven. In this paper we generalize these results to the case of a time-varying activation function, which is very useful in the field of parameter estimation and communications.","PeriodicalId":117809,"journal":{"name":"Proceedings of the Joint INDS'11 & ISTET'11","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":"{\"title\":\"Stability analysis of recurrent neural networks with time-varying activation functions\",\"authors\":\"M. Mostafa, W. Teich, J. Lindner\",\"doi\":\"10.1109/INDS.2011.6024816\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The dynamical behavior of a single layer recurrent neural network without hidden neurons has been investigated intensively and its stability has been analyzed using the Lyapunov method. Since the pioneering work of Hopfield many modified versions of the original Hopfield network have been suggested and their stability has been proven. In this paper we generalize these results to the case of a time-varying activation function, which is very useful in the field of parameter estimation and communications.\",\"PeriodicalId\":117809,\"journal\":{\"name\":\"Proceedings of the Joint INDS'11 & ISTET'11\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-07-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"7\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the Joint INDS'11 & ISTET'11\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/INDS.2011.6024816\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the Joint INDS'11 & ISTET'11","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INDS.2011.6024816","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Stability analysis of recurrent neural networks with time-varying activation functions
The dynamical behavior of a single-layer recurrent neural network without hidden neurons has been investigated intensively, and its stability has been analyzed using the Lyapunov method. Since the pioneering work of Hopfield, many modified versions of the original Hopfield network have been proposed and their stability proven. In this paper we generalize these results to the case of a time-varying activation function, which is useful in parameter estimation and in communications.
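To give a concrete feel for the kind of system the abstract refers to, the following is a minimal sketch (not the authors' formulation) of a discrete-time, single-layer recurrent network of Hopfield type, x(k+1) = phi_k(W x(k) + b), in which the activation phi_k changes with the time step k. The symmetric weight matrix W, the slope schedule beta(k), and the quadratic energy function monitored below are illustrative assumptions, not taken from the paper; the energy is printed only as a candidate Lyapunov function along the trajectory, with no claim that it decreases for this particular update rule.

import numpy as np

rng = np.random.default_rng(0)
N = 8                                  # number of neurons
A = rng.standard_normal((N, N))
W = (A + A.T) / 2                      # symmetric weights
np.fill_diagonal(W, 0.0)               # no self-coupling
b = rng.standard_normal(N)             # external inputs / biases

def phi(u, k):
    """Time-varying activation: tanh whose slope grows with the step index k (hypothetical schedule)."""
    beta = 1.0 + 0.1 * k
    return np.tanh(beta * u)

def energy(x):
    """Hopfield-style quadratic energy, used here only as a candidate Lyapunov function to monitor."""
    return -0.5 * x @ W @ x - b @ x

x = rng.uniform(-1.0, 1.0, size=N)     # random initial state
for k in range(50):
    x = phi(W @ x + b, k)              # synchronous state update with time-varying activation
    print(f"step {k:2d}  energy {energy(x):+.4f}")

Running the sketch and watching the printed energy illustrates numerically the stability question that the paper treats analytically with the Lyapunov method for time-varying activation functions.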