Sparsity in Reservoir Computing Neural Networks

C. Gallicchio
{"title":"Sparsity in Reservoir Computing Neural Networks","authors":"C. Gallicchio","doi":"10.1109/INISTA49547.2020.9194611","DOIUrl":null,"url":null,"abstract":"Reservoir Computing (RC) is a well-known strategy for designing Recurrent Neural Networks featured by striking efficiency of training. The crucial aspect of RC is to properly instantiate the hidden recurrent layer that serves as dynamical memory to the system. In this respect, the common recipe is to create a pool of randomly and sparsely connected recurrent neurons. While the aspect of sparsity in the design of RC systems has been debated in the literature, it is nowadays understood mainly as a way to enhance the efficiency of computation, exploiting sparse matrix operations. In this paper, we empirically investigate the role of sparsity in RC network design under the perspective of the richness of the developed temporal representations. We analyze both sparsity in the recurrent connections, and in the connections from the input to the reservoir. Our results point out that sparsity, in particular in input-reservoir connections, has a major role in developing internal temporal representations that have a longer short-term memory of past inputs and a higher dimension.","PeriodicalId":124632,"journal":{"name":"2020 International Conference on INnovations in Intelligent SysTems and Applications (INISTA)","volume":"254 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on INnovations in Intelligent SysTems and Applications (INISTA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INISTA49547.2020.9194611","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 6

Abstract

Reservoir Computing (RC) is a well-known strategy for designing Recurrent Neural Networks characterized by strikingly efficient training. The crucial aspect of RC is to properly instantiate the hidden recurrent layer that serves as the dynamical memory of the system. In this respect, the common recipe is to create a pool of randomly and sparsely connected recurrent neurons. While the role of sparsity in the design of RC systems has been debated in the literature, it is nowadays understood mainly as a way to improve computational efficiency by exploiting sparse matrix operations. In this paper, we empirically investigate the role of sparsity in RC network design from the perspective of the richness of the developed temporal representations. We analyze sparsity both in the recurrent connections and in the connections from the input to the reservoir. Our results indicate that sparsity, in particular in the input-reservoir connections, plays a major role in developing internal temporal representations that have a longer short-term memory of past inputs and a higher dimensionality.
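The abstract describes reservoirs whose input-to-reservoir and recurrent connections are randomly and sparsely instantiated, and assesses the resulting representations in terms of memory and dimensionality. The sketch below is a minimal, hypothetical illustration of that kind of setup, not the paper's exact experimental protocol: an Echo State Network-style reservoir with configurable connectivity for both weight matrices, plus a participation-ratio estimate of the state-space dimensionality as one rough proxy for the "richness" of the temporal representation. Function names such as `build_reservoir`, parameters such as `connectivity_in`, and the chosen hyperparameter values are illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's exact configuration) of a
# reservoir with controllable sparsity in both the input-to-reservoir and
# the recurrent weight matrices.
import numpy as np

def sparse_matrix(rows, cols, connectivity, rng):
    """Random matrix in [-1, 1] with a fraction `connectivity` of non-zero
    entries; the remaining entries are masked to zero."""
    w = rng.uniform(-1.0, 1.0, size=(rows, cols))
    mask = rng.random((rows, cols)) < connectivity
    return w * mask

def build_reservoir(n_in, n_res, connectivity_in, connectivity_rec,
                    spectral_radius=0.9, input_scaling=1.0, seed=0):
    rng = np.random.default_rng(seed)
    w_in = input_scaling * sparse_matrix(n_res, n_in, connectivity_in, rng)
    w_rec = sparse_matrix(n_res, n_res, connectivity_rec, rng)
    # Rescale the recurrent weights to the desired spectral radius
    # (a standard Echo State Network initialization step).
    rho = np.max(np.abs(np.linalg.eigvals(w_rec)))
    w_rec *= spectral_radius / rho
    return w_in, w_rec

def run_reservoir(w_in, w_rec, inputs):
    """Collect the reservoir states driven by `inputs` (shape: T x n_in)."""
    x = np.zeros(w_rec.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(w_in @ u + w_rec @ x)
        states.append(x)
    return np.array(states)

# Example: compare the effective dimensionality of the reservoir state space
# for a dense vs. a sparse input layer, using the participation ratio of the
# state covariance spectrum as one possible (assumed) richness proxy.
T, n_in, n_res = 2000, 1, 100
u = np.random.default_rng(1).uniform(-1, 1, size=(T, n_in))
for conn_in in (1.0, 0.1):
    w_in, w_rec = build_reservoir(n_in, n_res, conn_in, connectivity_rec=0.1)
    states = run_reservoir(w_in, w_rec, u)[200:]            # drop washout
    cov_eigs = np.linalg.eigvalsh(np.cov(states.T))
    eff_dim = cov_eigs.sum() ** 2 / (cov_eigs ** 2).sum()   # participation ratio
    print(f"input connectivity {conn_in}: effective dimension {eff_dim:.1f}")
```

The participation ratio is only one of several possible dimensionality measures; the paper's own analysis may rely on different quantities (e.g. memory-capacity tasks), so this snippet should be read as an assumed, self-contained illustration of how input and recurrent sparsity can be varied independently.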