Title: Adaptive structure generation and neuronal differentiation for memory encoding in SNNs
DOI: 10.1016/j.neucom.2024.128470
Journal: Neurocomputing (JCR Q1, Computer Science, Artificial Intelligence; Impact Factor 5.5)
Publication date: 2024-08-28
Publication type: Journal Article
Open-access PDF: https://www.sciencedirect.com/science/article/pii/S0925231224012414/pdfft?md5=3e719464f846393c8980b4a91dc9f801&pid=1-s2.0-S0925231224012414-main.pdf
Article URL: https://www.sciencedirect.com/science/article/pii/S0925231224012414
Citations: 0
Abstract
Memory is the core of cognition. Exploring the memory encoding mechanism, that is, how information is represented in a Spiking Neural Network (SNN), is the basis for the in-depth study of memory. In this paper, we study the memory encoding mechanism of multilayer SNN models from a biomimetic perspective and exploit the high biological plausibility of SNNs to enable the network to effectively simulate memory effects. We propose a series of heuristic neuron-growth and connection algorithms together with supervised weight-learning algorithms, applied during the unsupervised and supervised training of the representation layer. These methods optimize the structure of the representation layer, achieve functional differentiation of neurons, and enable the network to generate distinct representations for different input patterns. Under our algorithms, the proposed model converges stably for identical pattern inputs while exhibiting distinct representations of, and sensitivities to, different visual modalities. To achieve stable information expression within the network, we conducted various comparative experiments to determine the parameters of the complex network. This paper contributes to the development of brain-inspired intelligence by bridging computer science and neuroscience, using simulations to validate biological hypotheses and guide machine learning.
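The abstract does not give the paper's actual growth or learning rules, so the following is only an illustrative sketch of the general idea: a leaky integrate-and-fire (LIF) representation layer whose neuron population can grow when existing neurons are all highly active. The function names (`lif_step`, `grow_neurons`), the growth criterion, and all parameter values are hypothetical assumptions, not the authors' algorithm.

```python
import numpy as np

def lif_step(v, i_in, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0,
             tau=20.0, dt=1.0):
    """One Euler step of leaky integrate-and-fire dynamics.

    Membrane potentials decay toward v_rest, are driven by input
    current i_in, and reset after crossing v_thresh."""
    v = v + dt / tau * (v_rest - v + i_in)
    spikes = v >= v_thresh
    v = np.where(spikes, v_reset, v)
    return v, spikes

def grow_neurons(weights, spike_counts, target_rate, n_steps, rng):
    """Toy growth heuristic: if every existing neuron fires above
    target_rate, append one new neuron with random afferent weights
    so the layer can develop a differentiated representation."""
    rates = spike_counts / n_steps
    if np.all(rates > target_rate):
        new_col = rng.normal(0.0, 0.5, size=(weights.shape[0], 1))
        weights = np.concatenate([weights, new_col], axis=1)
    return weights

# Usage sketch: drive the layer with random input spike trains.
rng = np.random.default_rng(0)
n_in, n_out, n_steps = 16, 4, 200
W = rng.normal(4.0, 0.5, size=(n_in, n_out))   # input -> layer weights
v = np.full(n_out, -65.0)                      # membrane potentials
counts = np.zeros(n_out)                       # spikes per neuron
for t in range(n_steps):
    x = (rng.random(n_in) < 0.3).astype(float)  # Bernoulli input spikes
    v, s = lif_step(v, x @ W)
    counts += s
W = grow_neurons(W, counts, target_rate=0.01, n_steps=n_steps, rng=rng)
```

In this sketch the growth decision is driven purely by firing rates; the paper's heuristics additionally shape connectivity and apply supervised weight learning, details that are not recoverable from the abstract alone.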
Journal overview:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. The journal covers neurocomputing theory, practice, and applications.