Improving error tolerance of self-organizing neural nets
F. Sha, Q. Gan
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks
Published: 1991-11-18
DOI: 10.1109/IJCNN.1991.170279
Citations: 0
Abstract
A hybrid neural net (HNN) is developed that combines the ART1 network introduced by G.A. Carpenter and S. Grossberg (1987, 1988) with the Hopfield associative memory (HAM). The HAM stage diminishes noise in the samples and supplies the cleaned samples to ART1 as inputs. To match the capacity of HAM with that of ART1, a new recall algorithm (NHAM) is also introduced that enlarges HAM's capacity. Based on NHAM and HNN, a revised version of HNN (RHNN) is presented. RHNN differs from HNN in that it has feedback loops, whereas HNN has only feedforward paths; the ART1 module in RHNN supplies information that helps HAM recall memories. Computer simulations demonstrated that RHNN has several advantages.
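The abstract does not give the NHAM recall rule itself, but the role HAM plays in the hybrid — denoising a corrupted sample before it reaches ART1 — can be illustrated with a standard Hebbian Hopfield associative memory. The sketch below is an assumption-laden stand-in for the paper's HAM stage, not the authors' algorithm: patterns are ±1 vectors, weights come from the outer-product (Hebbian) rule, and recall iterates synchronous sign updates to a fixed point.

```python
import numpy as np

def train_hopfield(patterns):
    # Hebbian outer-product storage; patterns is a (k, n) array of +/-1 vectors.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)  # no self-connections
    return W / n

def recall(W, x, max_steps=20):
    # Synchronous sign updates until a fixed point (or the step limit).
    x = x.copy()
    for _ in range(max_steps):
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1  # break ties toward +1
        if np.array_equal(nxt, x):
            break
        x = nxt
    return x

# Store two orthogonal patterns, then denoise a one-bit-corrupted copy of the
# first one -- this is the kind of cleanup HAM performs before ART1 sees the input.
patterns = np.array([
    [1,  1, 1,  1, -1, -1, -1, -1],
    [1, -1, 1, -1,  1, -1,  1, -1],
])
W = train_hopfield(patterns)
noisy = patterns[0].copy()
noisy[0] = -noisy[0]        # flip one bit to simulate sample noise
clean = recall(W, noisy)    # recovers patterns[0]
```

The paper's NHAM modifies the recall procedure precisely because this basic scheme's capacity (roughly 0.14n patterns) is far below what ART1 can categorize; RHNN additionally feeds ART1's output back to guide this recall step.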