Comparison of Bagging and Sparcity Methods for Connectivity Reduction in Spiking Neural Networks with Memristive Plasticity
R. Rybka, Yury Davydov, Danila Vlasov, A. Serenko, A. Sboev, Vyacheslav Ilyin
Big Data and Cognitive Computing, published 2024-02-23. DOI: 10.3390/bdcc8030022
Citations: 0
Abstract
Developing a spiking neural network architecture that could prospectively be trained on energy-efficient neuromorphic hardware to solve various data-analysis tasks requires satisfying the limitations of prospective analog or digital hardware: local learning and a limited number of connections, respectively. In this work, we compare two methods of connectivity reduction applicable to spiking networks with local plasticity. Instead of a large fully connected network (used as the baseline for comparison), we employ either an ensemble of independent small networks or a network with probabilistic sparse connectivity. We evaluate both methods with a three-layer spiking neural network applied to handwritten- and spoken-digit classification tasks, using two memristive plasticity models and the classical spike-timing-dependent plasticity (STDP) rule. Both methods achieve an F1-score of 0.93–0.95 on the handwritten digit recognition task and 0.85–0.93 on the spoken digit recognition task. Combining the two methods makes it possible to obtain highly accurate models while reducing the number of connections by more than a factor of three compared to the baseline model.
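The probabilistic sparse connectivity mentioned in the abstract can be illustrated with a minimal sketch: each potential synapse between an input and an output neuron is kept independently with probability p. The function name, layer sizes, and probability value below are illustrative assumptions, not taken from the paper.

```python
import random

def sparse_mask(n_in, n_out, p, seed=0):
    """Bernoulli(p) connectivity mask: True where a synapse is kept."""
    rng = random.Random(seed)
    return [[rng.random() < p for _ in range(n_out)] for _ in range(n_in)]

# Illustrative sizes: 784 inputs (e.g. 28x28 pixels), 100 output neurons.
mask = sparse_mask(784, 100, p=0.3, seed=42)
n_conn = sum(sum(row) for row in mask)
# With p = 0.3, roughly 30% of the 78,400 possible synapses remain,
# i.e. a connectivity reduction of more than a factor of three.
```

In an actual spiking network, such a mask would be applied once at construction time, so that plasticity (STDP or a memristive rule) only ever updates the surviving synapses.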