Prototype Analysis in Hopfield Networks with Hebbian Learning
Hayden McAlister, Anthony Robins, Lech Szymanski
arXiv:2407.03342 (arXiv - PHYS - Disordered Systems and Neural Networks, published 2024-05-29)
We discuss prototype formation in the Hopfield network. Typically, Hebbian
learning with highly correlated states leads to degraded memory performance. We
show this type of learning can lead to prototype formation, where unlearned
states emerge as representatives of large correlated subsets of states,
alleviating capacity woes. This process has similarities to prototype learning
in human cognition. We provide a substantial literature review of prototype
learning in associative memories, covering contributions from psychology,
statistical physics, and computer science. We analyze prototype formation from
a theoretical perspective and derive a stability condition for these states
based on the number of examples of the prototype presented for learning, the
noise in those examples, and the number of non-example states presented. The
stability condition is used to construct a probability of stability for a
prototype state as the factors of stability change. We also note similarities
to traditional network analysis, allowing us to find a prototype capacity. We
corroborate these expectations of prototype formation with experiments using a
simple Hopfield network with standard Hebbian learning. We extend our
experiments to a Hopfield network trained on data with multiple prototypes and
find the network is capable of stabilizing multiple prototypes concurrently. We
measure the basins of attraction of the multiple prototype states, finding
attractor strength grows with the number of examples and the agreement of
examples. We link the stability and dominance of prototype states to the energy
profile of these states, particularly when comparing the profile shape to
target states or other spurious states.
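The setup the abstract describes — standard Hebbian learning on noisy examples of a prototype that is itself never presented, followed by checks of prototype stability and basin of attraction — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation; the parameter values (N, K, flip_p) and the synchronous update scheme are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # neurons (assumed for illustration)
K = 20         # noisy examples of one prototype (assumed)
flip_p = 0.15  # per-bit probability an example disagrees with the prototype (assumed)

# The prototype itself is never presented for learning; only noisy examples are.
prototype = rng.choice([-1, 1], size=N)
flips = rng.random((K, N)) < flip_p
examples = np.where(flips, -prototype, prototype)

# Standard Hebbian rule: W_ij = (1/N) * sum_k x_i^k x_j^k, with zero self-connections.
W = examples.T @ examples / N
np.fill_diagonal(W, 0.0)

def settle(state, W, steps=50):
    """Synchronous sign updates until a fixed point or the step limit."""
    for _ in range(steps):
        nxt = np.sign(W @ state)
        nxt[nxt == 0] = 1  # break ties deterministically
        if np.array_equal(nxt, state):
            break
        state = nxt
    return state

# Stability: does the unlearned prototype sit at (or very near) a fixed point?
overlap = (settle(prototype.copy(), W) == prototype).mean()
print(f"prototype overlap after settling: {overlap:.3f}")

# Rough basin-of-attraction probe: flip d bits of the prototype and see
# whether the network relaxes back to it.
for d in (10, 40, 80):
    probe = prototype.copy()
    idx = rng.choice(N, size=d, replace=False)
    probe[idx] *= -1
    back = (settle(probe, W) == prototype).mean()
    print(f"{d} bits flipped -> overlap after settling: {back:.3f}")
```

With enough reasonably consistent examples, the example states' local fields align with the prototype, so the unlearned prototype ends up at or near a fixed point, and perturbed states relax back toward it — the qualitative behavior the abstract reports.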