Non-exchangeable networks of integrate-and-fire neurons: spatially-extended mean-field limit of the empirical measure

Pierre-Emmanuel Jabin, Valentin Schmutz, Datong Zhou
arXiv:2409.06325 [q-bio.NC] (arXiv - Quantitative Biology - Neurons and Cognition) · Published 2024-09-10

Abstract

The dynamics of exchangeable or spatially-structured networks of $N$ interacting stochastic neurons can be described by deterministic population equations in the mean-field limit $N\to\infty$, when synaptic weights scale as $O(1/N)$. This asymptotic behavior has been proven in several works but a general question has remained unanswered: does the $O(1/N)$ scaling of synaptic weights, by itself, suffice to guarantee the convergence of network dynamics to a deterministic population equation, even when networks are not assumed to be exchangeable or spatially structured? In this work, we consider networks of stochastic integrate-and-fire neurons with arbitrary synaptic weights satisfying only an $O(1/N)$ scaling condition. Borrowing results from the theory of dense graph limits (graphons), we prove that, as $N\to\infty$, and up to the extraction of a subsequence, the empirical measure of the neurons' membrane potentials converges to the solution of a spatially-extended mean-field partial differential equation (PDE). Our proof requires analytical techniques that go beyond standard propagation of chaos methods. In particular, we introduce a weak metric that depends on the dense graph limit kernel and we show how the weak convergence of the initial data can be obtained by propagating the regularity of the limit kernel along the dual-backward equation associated with the spatially-extended mean-field PDE. Overall, this result invites us to re-interpret spatially-extended population equations as universal mean-field limits of networks of neurons with $O(1/N)$ synaptic weight scaling.
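The setting described above can be illustrated with a small numerical sketch. The model choices below (escape-rate firing with rate $f(v)=\max(v,0)$, reset to 0, leak toward 0, and a smooth graphon kernel $w(x,y)=1+\cos(2\pi(x-y))$ sampled at positions $i/N$) are hypothetical stand-ins, not the paper's specific assumptions; the point is only the structure: weights of size $w(i/N, j/N)/N$, i.e. the $O(1/N)$ scaling, and the empirical measure of membrane potentials as the object whose large-$N$ limit is studied.

```python
import numpy as np

def simulate_lif_network(N, T=2.0, dt=0.01, seed=0):
    """Simulate N stochastic integrate-and-fire neurons with O(1/N) weights.

    Hypothetical model (for illustration only): escape rate f(v) = max(v, 0),
    reset to 0 after a spike, linear leak, and synaptic weights sampled from
    a smooth graphon kernel w(x, y) = 1 + cos(2*pi*(x - y)) at x_i = i/N.
    """
    rng = np.random.default_rng(seed)
    x = (np.arange(N) + 0.5) / N                # neuron "positions" in [0, 1]
    # Dense weight matrix with the O(1/N) scaling: W_ij = w(x_i, x_j) / N
    W = (1.0 + np.cos(2 * np.pi * (x[:, None] - x[None, :]))) / N
    V = rng.uniform(0.0, 1.0, size=N)           # random initial potentials
    for _ in range(int(T / dt)):
        rate = np.maximum(V, 0.0)               # escape rate f(V)
        spikes = rng.random(N) < rate * dt      # Bernoulli spike approximation
        V = V * (1.0 - dt) + W @ spikes.astype(float)  # leak + synaptic input
        V[spikes] = 0.0                         # reset spiking neurons
    return V

def empirical_hist(V, bins=20):
    """Histogram approximation of the empirical measure of the potentials."""
    h, _ = np.histogram(V, bins=bins, range=(0.0, 1.5), density=True)
    return h
```

Comparing `empirical_hist(simulate_lif_network(N))` for increasing `N` gives a rough numerical sense of the convergence statement: as `N` grows, the histograms stabilize toward a deterministic profile, here marginalized over the spatial variable (the full limit object in the paper is a spatially-extended density).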