{"title":"通过带噪数据训练改进Hopfield网络","authors":"F. Clift, T. Martinez","doi":"10.1109/IJCNN.2001.939521","DOIUrl":null,"url":null,"abstract":"An approach to training a generalized Hopfield network is developed and evaluated. Both the weight symmetricity constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with backpropagation through time, using noisy versions of the memorized patterns. Training in this way is referred to as noisy associative training (NAT). Performance of NAT is evaluated on both random and correlated data. NAT has been tested on several data sets, with a large number of training runs for each experiment. The data sets used include uniformly distributed random data and several data sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or pseudo-inverse training. Additionally, these networks have 13% fewer spurious memories on average than networks trained with pseudoinverse or Hebbian training. Typically, networks memorizing over 2N (where N is the number of nodes in the network) patterns are produced. Performance on correlated data shows an even greater improvement over networks produced with either Hebbian or pseudo-inverse training-an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.","PeriodicalId":346955,"journal":{"name":"IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2001-07-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Improved Hopfield networks by training with noisy data\",\"authors\":\"F. Clift, T. Martinez\",\"doi\":\"10.1109/IJCNN.2001.939521\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An approach to training a generalized Hopfield network is developed and evaluated. Both the weight symmetricity constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with backpropagation through time, using noisy versions of the memorized patterns. Training in this way is referred to as noisy associative training (NAT). Performance of NAT is evaluated on both random and correlated data. NAT has been tested on several data sets, with a large number of training runs for each experiment. The data sets used include uniformly distributed random data and several data sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or pseudo-inverse training. Additionally, these networks have 13% fewer spurious memories on average than networks trained with pseudoinverse or Hebbian training. Typically, networks memorizing over 2N (where N is the number of nodes in the network) patterns are produced. Performance on correlated data shows an even greater improvement over networks produced with either Hebbian or pseudo-inverse training-an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.\",\"PeriodicalId\":346955,\"journal\":{\"name\":\"IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. 
No.01CH37222)\",\"volume\":\"3 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2001-07-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN.2001.939521\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IJCNN'01. International Joint Conference on Neural Networks. Proceedings (Cat. No.01CH37222)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2001.939521","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Improved Hopfield networks by training with noisy data
An approach to training a generalized Hopfield network is developed and evaluated. Both the weight symmetry constraint and the zero self-connection constraint are removed from standard Hopfield networks. Training is accomplished with backpropagation through time, using noisy versions of the memorized patterns; training in this way is referred to as noisy associative training (NAT). Performance of NAT is evaluated on both random and correlated data. NAT has been tested on several data sets, with a large number of training runs for each experiment. The data sets used include uniformly distributed random data and several data sets adapted from the U.C. Irvine Machine Learning Repository. Results show that for random patterns, Hopfield networks trained with NAT have an average overall recall accuracy 6.1 times greater than networks produced with either Hebbian or pseudo-inverse training. Additionally, these networks have 13% fewer spurious memories on average than networks trained with pseudo-inverse or Hebbian training. Typically, the networks produced memorize over 2N patterns, where N is the number of nodes in the network. Performance on correlated data shows an even greater improvement over networks produced with either Hebbian or pseudo-inverse training: an average of 27.8 times greater recall accuracy, with 14% fewer spurious memories.
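The abstract gives the training recipe only in outline, but the core idea can be sketched. The snippet below is a minimal illustration, not the paper's implementation: it assumes a synchronous tanh update unrolled for a fixed number of steps, a mean-squared-error loss against the clean pattern, and arbitrary hyperparameters (noise rate, unrolling depth, optimizer, learning rate), none of which come from the paper. What it does take from the abstract is the essential structure: an unconstrained weight matrix (asymmetric, free diagonal) trained with backpropagation through time on noise-corrupted copies of the stored patterns.

```python
# Sketch of noisy associative training (NAT) per the abstract: a generalized
# Hopfield network (no weight symmetry, nonzero self-connections allowed),
# unrolled and trained with BPTT so that noisy inputs settle to clean patterns.
import torch

N = 16        # number of nodes (assumed)
P = 2 * N     # store 2N patterns, the capacity cited in the abstract
STEPS = 5     # unrolled recall steps for BPTT (assumed)
NOISE = 0.1   # per-bit flip probability for the noisy copies (assumed)

torch.manual_seed(0)
patterns = torch.where(torch.rand(P, N) < 0.5, -1.0, 1.0)  # random +/-1 patterns

# Full weight matrix and bias: no symmetry constraint, diagonal left free.
W = torch.zeros(N, N, requires_grad=True)
b = torch.zeros(N, requires_grad=True)
opt = torch.optim.Adam([W, b], lr=0.01)

def recall(x, steps=STEPS):
    """Synchronous update unrolled for `steps` iterations (tanh keeps it differentiable)."""
    for _ in range(steps):
        x = torch.tanh(x @ W.T + b)
    return x

for epoch in range(2000):
    # Fresh noisy versions of the memories each epoch.
    flips = torch.where(torch.rand(P, N) < NOISE, -1.0, 1.0)
    noisy = patterns * flips
    out = recall(noisy)
    loss = torch.mean((out - patterns) ** 2)  # drive the net back to the clean pattern
    opt.zero_grad()
    loss.backward()
    opt.step()

# Recall test: corrupt one pattern, then iterate with hard thresholding.
with torch.no_grad():
    state = patterns[0] * torch.where(torch.rand(N) < NOISE, -1.0, 1.0)
    for _ in range(20):
        state = torch.sign(state @ W.T + b)
print("recovered:", bool(torch.equal(state, patterns[0])))
```

A design point worth noting under these assumptions: the smooth tanh unit is used only during training so gradients can flow through the unrolled steps, while recall reverts to the usual hard-threshold dynamics. The paper's own choices of activation, loss, and optimizer may well differ.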