{"title":"神经网络非线性动力学中稀疏循环连接和输入的重建。","authors":"Victor J Barranca","doi":"10.1007/s10827-022-00831-x","DOIUrl":null,"url":null,"abstract":"<p><p>Reconstructing the recurrent structural connectivity of neuronal networks is a challenge crucial to address in characterizing neuronal computations. While directly measuring the detailed connectivity structure is generally prohibitive for large networks, we develop a novel framework for reverse-engineering large-scale recurrent network connectivity matrices from neuronal dynamics by utilizing the widespread sparsity of neuronal connections. We derive a linear input-output mapping that underlies the irregular dynamics of a model network composed of both excitatory and inhibitory integrate-and-fire neurons with pulse coupling, thereby relating network inputs to evoked neuronal activity. Using this embedded mapping and experimentally feasible measurements of the firing rate as well as voltage dynamics in response to a relatively small ensemble of random input stimuli, we efficiently reconstruct the recurrent network connectivity via compressive sensing techniques. Through analogous analysis, we then recover high dimensional natural stimuli from evoked neuronal network dynamics over a short time horizon. This work provides a generalizable methodology for rapidly recovering sparse neuronal network data and underlines the natural role of sparsity in facilitating the efficient encoding of network data in neuronal dynamics.</p>","PeriodicalId":54857,"journal":{"name":"Journal of Computational Neuroscience","volume":"51 1","pages":"43-58"},"PeriodicalIF":1.5000,"publicationDate":"2023-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Reconstruction of sparse recurrent connectivity and inputs from the nonlinear dynamics of neuronal networks.\",\"authors\":\"Victor J Barranca\",\"doi\":\"10.1007/s10827-022-00831-x\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Reconstructing the recurrent structural connectivity of neuronal networks is a challenge crucial to address in characterizing neuronal computations. While directly measuring the detailed connectivity structure is generally prohibitive for large networks, we develop a novel framework for reverse-engineering large-scale recurrent network connectivity matrices from neuronal dynamics by utilizing the widespread sparsity of neuronal connections. We derive a linear input-output mapping that underlies the irregular dynamics of a model network composed of both excitatory and inhibitory integrate-and-fire neurons with pulse coupling, thereby relating network inputs to evoked neuronal activity. Using this embedded mapping and experimentally feasible measurements of the firing rate as well as voltage dynamics in response to a relatively small ensemble of random input stimuli, we efficiently reconstruct the recurrent network connectivity via compressive sensing techniques. Through analogous analysis, we then recover high dimensional natural stimuli from evoked neuronal network dynamics over a short time horizon. 
This work provides a generalizable methodology for rapidly recovering sparse neuronal network data and underlines the natural role of sparsity in facilitating the efficient encoding of network data in neuronal dynamics.</p>\",\"PeriodicalId\":54857,\"journal\":{\"name\":\"Journal of Computational Neuroscience\",\"volume\":\"51 1\",\"pages\":\"43-58\"},\"PeriodicalIF\":1.5000,\"publicationDate\":\"2023-02-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Computational Neuroscience\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://doi.org/10.1007/s10827-022-00831-x\",\"RegionNum\":4,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"MATHEMATICAL & COMPUTATIONAL BIOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Neuroscience","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1007/s10827-022-00831-x","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"MATHEMATICAL & COMPUTATIONAL BIOLOGY","Score":null,"Total":0}
Reconstruction of sparse recurrent connectivity and inputs from the nonlinear dynamics of neuronal networks.
Reconstructing the recurrent structural connectivity of neuronal networks is a challenge crucial to address in characterizing neuronal computations. While directly measuring the detailed connectivity structure is generally prohibitive for large networks, we develop a novel framework for reverse-engineering large-scale recurrent network connectivity matrices from neuronal dynamics by utilizing the widespread sparsity of neuronal connections. We derive a linear input-output mapping that underlies the irregular dynamics of a model network composed of both excitatory and inhibitory integrate-and-fire neurons with pulse coupling, thereby relating network inputs to evoked neuronal activity. Using this embedded mapping and experimentally feasible measurements of the firing rate as well as voltage dynamics in response to a relatively small ensemble of random input stimuli, we efficiently reconstruct the recurrent network connectivity via compressive sensing techniques. Through analogous analysis, we then recover high dimensional natural stimuli from evoked neuronal network dynamics over a short time horizon. This work provides a generalizable methodology for rapidly recovering sparse neuronal network data and underlines the natural role of sparsity in facilitating the efficient encoding of network data in neuronal dynamics.
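The abstract describes recovering sparse connectivity through a derived linear input-output mapping combined with compressive sensing. The sketch below is only a generic illustration of that recovery step, not the paper's specific derivation: it assumes an underdetermined linear model y = A x, where A stands in for the linear mapping induced by a random stimulus ensemble and x for a sparse slice of the connectivity to be recovered. The matrix A, the synthetic x_true, the ista solver, and the lam parameter are all illustrative placeholders, and the l1 solver (iterative soft-thresholding) is a standard choice rather than the method used in the paper.

```python
# Minimal sketch of compressive-sensing-style sparse recovery (assumed setup,
# not the paper's exact pipeline): recover a sparse vector x from
# underdetermined linear measurements y = A x via l1-regularized least squares.
import numpy as np

def ista(A, y, lam=1e-3, n_iter=2000):
    """Iterative soft-thresholding for min_x 0.5*||A x - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2              # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)               # gradient of the quadratic term
        z = x - grad / L                       # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

rng = np.random.default_rng(0)
n, m, k = 200, 60, 10                          # unknowns, measurements, nonzeros
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)   # sparse ground truth
A = rng.normal(size=(m, n)) / np.sqrt(m)       # random measurement (stimulus) matrix
y = A @ x_true                                 # noiseless linear measurements
x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

With far fewer measurements than unknowns (here 60 versus 200), the l1 penalty exploits sparsity to recover x accurately, which is the same principle the paper applies to reconstruct sparse recurrent connectivity from a relatively small ensemble of random input stimuli.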
Journal description:
The Journal of Computational Neuroscience provides a forum for papers that fit the interface between computational and experimental work in the neurosciences. The Journal of Computational Neuroscience publishes full length original papers, rapid communications and review articles describing theoretical and experimental work relevant to computations in the brain and nervous system. Papers that combine theoretical and experimental work are especially encouraged. Primarily theoretical papers should deal with issues of obvious relevance to biological nervous systems. Experimental papers should have implications for the computational function of the nervous system, and may report results using any of a variety of approaches including anatomy, electrophysiology, biophysics, imaging, and molecular biology. Papers investigating the physiological mechanisms underlying pathologies of the nervous system, or papers that report novel technologies of interest to researchers in computational neuroscience, including advances in neural data analysis methods yielding insights into the function of the nervous system, are also welcomed (in this case, methodological papers should include an application of the new method, exemplifying the insights that it yields). It is anticipated that all levels of analysis from cognitive to cellular will be represented in the Journal of Computational Neuroscience.