{"title":"A learning method for recurrent networks based on minimization of finite automata","authors":"Itsuki Noda, Makoto Nagao","doi":"10.1109/IJCNN.1992.287211","DOIUrl":null,"url":null,"abstract":"A novel network model and a learning algorithm based on symbol processing theory are described. The algorithm is derived from the minimization method of finite automata under the correspondence between Elman networks and finite automata. An attempt was made to learn context-free grammars by the new model network. Even though this learning method was derived under the correspondence to finite automata, the network can learn the subgrammar, which is the important feature for distinguishing context-free grammars and finite state automata.<<ETX>>","PeriodicalId":286849,"journal":{"name":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","volume":"32 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1992-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"[Proceedings 1992] IJCNN International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.1992.287211","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 13
Abstract
A novel network model and a learning algorithm based on symbol processing theory are described. The algorithm is derived from the minimization method for finite automata, using the correspondence between Elman networks and finite automata. An attempt was made to learn context-free grammars with the new network model. Even though the learning method was derived from this correspondence to finite automata, the network can learn subgrammars, the key feature distinguishing context-free grammars from finite-state automata.
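
For context, the abstract's learning rule is grounded in the classical minimization of finite automata. The sketch below is a minimal illustration of that standard procedure (Moore-style partition refinement), not the authors' network-based method; all names and the example DFA are illustrative assumptions.

```python
# A minimal sketch of classical DFA state minimization by partition
# refinement (Moore's algorithm). This shows only the standard
# automata-theoretic procedure the abstract builds on; the paper's
# actual learning rule for Elman networks is not reproduced here.

def minimize_dfa(states, alphabet, delta, accepting):
    """Return the partition of `states` into equivalence classes.

    states:    iterable of state ids
    alphabet:  iterable of input symbols
    delta:     dict mapping (state, symbol) -> state
    accepting: set of accepting state ids
    """
    # Initial partition: accepting vs. non-accepting states.
    partition = [set(accepting), set(states) - set(accepting)]
    partition = [block for block in partition if block]

    changed = True
    while changed:
        changed = False
        new_partition = []
        for block in partition:
            # Group states in this block by which block each input
            # symbol sends them to; differing groups must be split.
            groups = {}
            for s in block:
                key = tuple(
                    next(i for i, b in enumerate(partition)
                         if delta[(s, a)] in b)
                    for a in alphabet
                )
                groups.setdefault(key, set()).add(s)
            if len(groups) > 1:
                changed = True
            new_partition.extend(groups.values())
        partition = new_partition
    return partition


# Hypothetical example: a 3-state DFA over {'0', '1'} in which
# states 1 and 2 behave identically and therefore merge.
states = [0, 1, 2]
alphabet = ['0', '1']
delta = {
    (0, '0'): 1, (0, '1'): 2,
    (1, '0'): 1, (1, '1'): 2,
    (2, '0'): 1, (2, '1'): 2,
}
accepting = {1, 2}
print(minimize_dfa(states, alphabet, delta, accepting))
# -> a partition merging states 1 and 2, e.g. [{1, 2}, {0}]
```

Under the correspondence the abstract describes, the hidden-state clusters of an Elman network play the role of automaton states, so merging equivalent states in this fashion would translate to merging redundant hidden representations.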