Learning in Associative Networks Through Pavlovian Dynamics.

Neural Computation · Pub Date: 2025-01-21 · DOI: 10.1162/neco_a_01730 · IF 2.7 · CAS Region 4 (Computer Science) · JCR Q3 (Computer Science, Artificial Intelligence)
Daniele Lotito, Miriam Aquaro, Chiara Marullo
{"title":"Learning in Associative Networks Through Pavlovian Dynamics.","authors":"Daniele Lotito, Miriam Aquaro, Chiara Marullo","doi":"10.1162/neco_a_01730","DOIUrl":null,"url":null,"abstract":"<p><p>Hebbian learning theory is rooted in Pavlov's classical conditioning While mathematical models of the former have been proposed and studied in the past decades, especially in spin glass theory, only recently has it been numerically shown that it is possible to write neural and synaptic dynamics that mirror Pavlov conditioning mechanisms and also give rise to synaptic weights that correspond to the Hebbian learning rule. In this article we show that the same dynamics can be derived with equilibrium statistical mechanics tools and basic and motivated modeling assumptions. Then we show how to study the resulting system of coupled stochastic differential equations assuming the reasonable separation of neural and synaptic timescale. In particular, we analytically demonstrate that this synaptic evolution converges to the Hebbian learning rule in various settings and compute the variance of the stochastic process. 
Finally, drawing from evidence on pure memory reinforcement during sleep stages, we show how the proposed model can simulate neural networks that undergo sleep-associated memory consolidation processes, thereby proving the compatibility of Pavlovian learning with dreaming mechanisms.</p>","PeriodicalId":54731,"journal":{"name":"Neural Computation","volume":" ","pages":"311-343"},"PeriodicalIF":2.7000,"publicationDate":"2025-01-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Computation","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1162/neco_a_01730","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

Hebbian learning theory is rooted in Pavlov's classical conditioning. While mathematical models of the former have been proposed and studied over the past decades, especially in spin glass theory, only recently has it been shown numerically that one can write neural and synaptic dynamics that mirror Pavlovian conditioning mechanisms and also give rise to synaptic weights corresponding to the Hebbian learning rule. In this article, we show that the same dynamics can be derived with equilibrium statistical mechanics tools and basic, well-motivated modeling assumptions. We then show how to study the resulting system of coupled stochastic differential equations under a reasonable separation of neural and synaptic timescales. In particular, we analytically demonstrate that this synaptic evolution converges to the Hebbian learning rule in various settings and compute the variance of the stochastic process. Finally, drawing on evidence of pure memory reinforcement during sleep stages, we show how the proposed model can simulate neural networks that undergo sleep-associated memory consolidation, thereby demonstrating the compatibility of Pavlovian learning with dreaming mechanisms.
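The abstract describes coupled neural and synaptic dynamics in which, under timescale separation, the slow synaptic variables relax toward the Hebbian weights. As a rough toy illustration only (not the paper's actual stochastic equations), one can clamp the fast neural variables to the stored patterns in turn and let a slow linear relaxation average their outer products; the fixed point of that averaged drift is exactly the Hebbian coupling matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 20, 3
xi = rng.choice([-1.0, 1.0], size=(P, N))  # P stored binary patterns

# Hebbian target couplings: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
J_hebb = xi.T @ xi / N

# Slow synaptic relaxation dJ/dt = (1/tau) * (-J + (P/N) * sigma sigma^T),
# with the fast neural state sigma clamped to each pattern in turn -- a
# crude stand-in for the timescale separation discussed in the abstract.
# Cycling through the P patterns averages the outer products, so the
# drift's fixed point is J_hebb.
J = np.zeros((N, N))
tau, dt, steps = 10.0, 0.1, 20000
for t in range(steps):
    sigma = xi[t % P]
    J += (dt / tau) * (-J + P * np.outer(sigma, sigma) / N)

print(np.max(np.abs(J - J_hebb)))  # residual deviation from the Hebbian weights
```

Because the relaxation time `tau` is long compared with the pattern-switching step, `J` sees only the average of the clamped outer products, which is the Hebbian rule; the residual ripple shrinks as `dt/tau` does. The paper's actual dynamics are stochastic, so convergence there is in distribution, with a variance the authors compute analytically.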

Source Journal

Neural Computation (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 6.30
Self-citation rate: 3.40%
Articles per year: 83
Review time: 3.0 months
About the journal: Neural Computation is uniquely positioned at the crossroads between neuroscience and TMCS and welcomes the submission of original papers from all areas of TMCS, including: Advanced experimental design; Analysis of chemical sensor data; Connectomic reconstructions; Analysis of multielectrode and optical recordings; Genetic data for cell identity; Analysis of behavioral data; Multiscale models; Analysis of molecular mechanisms; Neuroinformatics; Analysis of brain imaging data; Neuromorphic engineering; Principles of neural coding, computation, circuit dynamics, and plasticity; Theories of brain function.