{"title":"Non-exchangeable networks of integrate-and-fire neurons: spatially-extended mean-field limit of the empirical measure","authors":"Pierre-Emmanuel Jabin, Valentin Schmutz, Datong Zhou","doi":"arxiv-2409.06325","DOIUrl":null,"url":null,"abstract":"The dynamics of exchangeable or spatially-structured networks of $N$\ninteracting stochastic neurons can be described by deterministic population\nequations in the mean-field limit $N\\to\\infty$, when synaptic weights scale as\n$O(1/N)$. This asymptotic behavior has been proven in several works but a\ngeneral question has remained unanswered: does the $O(1/N)$ scaling of synaptic\nweights, by itself, suffice to guarantee the convergence of network dynamics to\na deterministic population equation, even when networks are not assumed to be\nexchangeable or spatially structured? In this work, we consider networks of\nstochastic integrate-and-fire neurons with arbitrary synaptic weights\nsatisfying only a $O(1/N)$ scaling condition. Borrowing results from the theory\nof dense graph limits (graphons), we prove that, as $N\\to\\infty$, and up to the\nextraction of a subsequence, the empirical measure of the neurons' membrane\npotentials converges to the solution of a spatially-extended mean-field partial\ndifferential equation (PDE). Our proof requires analytical techniques that go\nbeyond standard propagation of chaos methods. In particular, we introduce a\nweak metric that depends on the dense graph limit kernel and we show how the\nweak convergence of the initial data can be obtained by propagating the\nregularity of the limit kernel along the dual-backward equation associated with\nthe spatially-extended mean-field PDE. Overall, this result invites us to\nre-interpret spatially-extended population equations as universal mean-field\nlimits of networks of neurons with $O(1/N)$ synaptic weight scaling.","PeriodicalId":501517,"journal":{"name":"arXiv - QuanBio - Neurons and Cognition","volume":"5 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Neurons and Cognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06325","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The dynamics of exchangeable or spatially-structured networks of $N$
interacting stochastic neurons can be described by deterministic population
equations in the mean-field limit $N\to\infty$, when synaptic weights scale as
$O(1/N)$. This asymptotic behavior has been proven in several works, but a
general question has remained unanswered: does the $O(1/N)$ scaling of synaptic
weights, by itself, suffice to guarantee the convergence of network dynamics to
a deterministic population equation, even when networks are not assumed to be
exchangeable or spatially structured? In this work, we consider networks of
stochastic integrate-and-fire neurons with arbitrary synaptic weights
satisfying only an $O(1/N)$ scaling condition. Borrowing results from the theory
of dense graph limits (graphons), we prove that, as $N\to\infty$, and up to the
extraction of a subsequence, the empirical measure of the neurons' membrane
potentials converges to the solution of a spatially-extended mean-field partial
differential equation (PDE). Our proof requires analytical techniques that go
beyond standard propagation of chaos methods. In particular, we introduce a
weak metric that depends on the dense graph limit kernel, and we show how the
weak convergence of the initial data can be obtained by propagating the
regularity of the limit kernel along the dual-backward equation associated with
the spatially-extended mean-field PDE. Overall, this result invites us to
re-interpret spatially-extended population equations as universal mean-field
limits of networks of neurons with $O(1/N)$ synaptic weight scaling.
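
For readers who want the setup in symbols, the LaTeX sketch below records one common way to write such a network and its empirical measure. It is illustrative only: the drift $b$, the firing-rate function $f$, and the graphon-type kernel representation $w_{ij} = J(i/N, j/N)/N$ are assumptions introduced here for concreteness, whereas the paper itself requires only the $O(1/N)$ scaling of otherwise arbitrary weights.

% Illustrative sketch, not the paper's exact hypotheses: the kernel form
% of the weights below is one graphon-type example of O(1/N) scaling.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

A network of $N$ stochastic integrate-and-fire neurons with membrane
potentials $V_1,\dots,V_N$, where neuron $j$ spikes at rate $f(V_j(t))$
with spike counter $Z_j$ (post-spike reset of $V_j$ omitted for brevity),
can be written schematically as
\begin{equation*}
  \mathrm{d}V_i(t) \;=\; b\bigl(V_i(t)\bigr)\,\mathrm{d}t
  \;+\; \sum_{j=1}^{N} w_{ij}\,\mathrm{d}Z_j(t),
  \qquad
  w_{ij} \;=\; \frac{1}{N}\,J\!\Bigl(\tfrac{i}{N},\tfrac{j}{N}\Bigr),
\end{equation*}
with $J\colon[0,1]^2\to\mathbb{R}$ a bounded kernel (graphon). The object
whose limit is studied is the spatially-indexed empirical measure
\begin{equation*}
  \mu^N_t \;=\; \frac{1}{N}\sum_{i=1}^{N}
  \delta_{\left(\frac{i}{N},\,V_i(t)\right)}
  \quad\text{on } [0,1]\times\mathbb{R},
\end{equation*}
which, along a subsequence as $N\to\infty$, converges weakly to the
solution of a spatially-extended mean-field PDE indexed by the limit
kernel.

\end{document}

Note the role of the auxiliary coordinate $i/N$: it records each neuron's "position" in the dense graph limit, which is how a non-exchangeable network acquires the spatial structure of the limiting PDE.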