{"title":"The connectivity degree controls the difficulty in reservoir design of random boolean networks","authors":"Emmanuel Calvet, Bertrand Reulet, Jean Rouat","doi":"10.3389/fncom.2024.1348138","DOIUrl":null,"url":null,"abstract":"<p>Reservoir Computing (RC) is a paradigm in artificial intelligence where a recurrent neural network (RNN) is used to process temporal data, leveraging the inherent dynamical properties of the reservoir to perform complex computations. In the realm of RC, the excitatory-inhibitory balance <italic>b</italic> has been shown to be pivotal for driving the dynamics and performance of Echo State Networks (ESN) and, more recently, Random Boolean Network (RBN). However, the relationship between <italic>b</italic> and other parameters of the network is still poorly understood. This article explores how the interplay of the balance <italic>b</italic>, the connectivity degree <italic>K</italic> (i.e., the number of synapses per neuron) and the size of the network (i.e., the number of neurons <italic>N</italic>) influences the dynamics and performance (memory and prediction) of an RBN reservoir. Our findings reveal that <italic>K</italic> and <italic>b</italic> are strongly tied in optimal reservoirs. Reservoirs with high <italic>K</italic> have two optimal balances, one for globally inhibitory networks (<italic>b</italic> < 0), and the other one for excitatory networks (<italic>b</italic> > 0). Both show asymmetric performances about a zero balance. In contrast, for moderate <italic>K</italic>, the optimal value being <italic>K</italic> = 4, best reservoirs are obtained when excitation and inhibition almost, but not exactly, balance each other. For almost all <italic>K</italic>, the influence of the size is such that increasing <italic>N</italic> leads to better performance, even with very large values of <italic>N</italic>. 
Our investigation provides clear directions to generate optimal reservoirs or reservoirs with constraints on size or connectivity.</p>","PeriodicalId":12363,"journal":{"name":"Frontiers in Computational Neuroscience","volume":null,"pages":null},"PeriodicalIF":2.1000,"publicationDate":"2024-02-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Computational Neuroscience","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.3389/fncom.2024.1348138","RegionNum":4,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATHEMATICAL & COMPUTATIONAL BIOLOGY","Score":null,"Total":0}
Citations: 0
Abstract
Reservoir Computing (RC) is a paradigm in artificial intelligence where a recurrent neural network (RNN) is used to process temporal data, leveraging the inherent dynamical properties of the reservoir to perform complex computations. In the realm of RC, the excitatory-inhibitory balance b has been shown to be pivotal for driving the dynamics and performance of Echo State Networks (ESNs) and, more recently, Random Boolean Networks (RBNs). However, the relationship between b and other parameters of the network is still poorly understood. This article explores how the interplay of the balance b, the connectivity degree K (i.e., the number of synapses per neuron), and the size of the network (i.e., the number of neurons N) influences the dynamics and performance (memory and prediction) of an RBN reservoir. Our findings reveal that K and b are strongly tied in optimal reservoirs. Reservoirs with high K have two optimal balances, one for globally inhibitory networks (b < 0) and one for excitatory networks (b > 0); performance is asymmetric about zero balance in both cases. In contrast, for moderate K (the optimum being K = 4), the best reservoirs are obtained when excitation and inhibition almost, but not exactly, balance each other. For almost all K, increasing the size N leads to better performance, even for very large values of N. Our investigation provides clear directions for generating optimal reservoirs, including reservoirs with constraints on size or connectivity.
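The model described above can be illustrated with a minimal sketch of an RBN reservoir. This is not the authors' implementation: it assumes a common sign-weight formulation in which each of N neurons receives K random inputs with synaptic weights of +1 (excitatory) or -1 (inhibitory), the probability of an excitatory synapse being (1 + b)/2 so that b is the expected mean synaptic sign, and neurons update synchronously by thresholding their summed input.

```python
import numpy as np

def make_rbn_reservoir(N, K, b, seed=0):
    """Build a random boolean network reservoir.

    Each of the N neurons receives K inputs drawn at random from the
    other neurons. Synaptic weights are +1 (excitatory) or -1
    (inhibitory); the probability of an excitatory synapse is
    (1 + b) / 2, so b = +1 is fully excitatory, b = -1 fully
    inhibitory, and b = 0 exactly balanced.
    """
    rng = np.random.default_rng(seed)
    inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
    p_exc = (1 + b) / 2
    signs = rng.choice(np.array([1, -1]), size=(N, K), p=[p_exc, 1 - p_exc])
    return inputs, signs

def step(state, inputs, signs):
    """One synchronous update: a neuron fires if its summed signed input > 0."""
    drive = (signs * state[inputs]).sum(axis=1)
    return (drive > 0).astype(np.int8)

# Example: a moderately connected (K = 4), slightly inhibitory reservoir.
N, K, b = 100, 4, -0.1
inputs, signs = make_rbn_reservoir(N, K, b, seed=42)
state = (np.random.default_rng(1).random(N) < 0.5).astype(np.int8)
for _ in range(50):
    state = step(state, inputs, signs)
```

Driving such a network with an input stream and training a linear readout on the binary states would complete the reservoir-computing pipeline; the sketch here only covers the autonomous dynamics that b, K, and N control.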
About the journal
Frontiers in Computational Neuroscience is a first-tier electronic journal devoted to promoting theoretical modeling of brain function and fostering interdisciplinary interactions between theoretical and experimental neuroscience. Progress in understanding the amazing capabilities of the brain is still limited, and we believe that it will only come with deep theoretical thinking and mutually stimulating cooperation between different disciplines and approaches. We therefore invite original contributions on a wide range of topics that present the fruits of such cooperation, or provide stimuli for future alliances. We aim to provide an interactive forum for cutting-edge theoretical studies of the nervous system, and for promulgating the best theoretical research to the broader neuroscience community. Models of all styles and at all levels are welcome, from biophysically motivated realistic simulations of neurons and synapses to high-level abstract models of inference and decision making. While the journal is primarily focused on theoretically based and driven research, we welcome experimental studies that validate and test theoretical conclusions.