Sparse learning enabled by constraints on connectivity and function

Mirza M. Junaid Baig, Armen Stepanyants
{"title":"Sparse learning enabled by constraints on connectivity and function","authors":"Mirza M. Junaid Baig, Armen Stepanyants","doi":"arxiv-2409.04946","DOIUrl":null,"url":null,"abstract":"Sparse connectivity is a hallmark of the brain and a desired property of\nartificial neural networks. It promotes energy efficiency, simplifies training,\nand enhances the robustness of network function. Thus, a detailed understanding\nof how to achieve sparsity without jeopardizing network performance is\nbeneficial for neuroscience, deep learning, and neuromorphic computing\napplications. We used an exactly solvable model of associative learning to\nevaluate the effects of various sparsity-inducing constraints on connectivity\nand function. We determine the optimal level of sparsity achieved by the $l_0$\nnorm constraint and find that nearly the same efficiency can be obtained by\neliminating weak connections. We show that this method of achieving sparsity\ncan be implemented online, making it compatible with neuroscience and machine\nlearning applications.","PeriodicalId":501517,"journal":{"name":"arXiv - QuanBio - Neurons and Cognition","volume":"27 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Neurons and Cognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.04946","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Sparse connectivity is a hallmark of the brain and a desired property of artificial neural networks. It promotes energy efficiency, simplifies training, and enhances the robustness of network function. Thus, a detailed understanding of how to achieve sparsity without jeopardizing network performance is beneficial for neuroscience, deep learning, and neuromorphic computing applications. We use an exactly solvable model of associative learning to evaluate the effects of various sparsity-inducing constraints on connectivity and function. We determine the optimal level of sparsity achieved by the $l_0$ norm constraint and find that nearly the same efficiency can be obtained by eliminating weak connections. We show that this method of achieving sparsity can be implemented online, making it compatible with neuroscience and machine learning applications.
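The abstract gives no implementation details, so the following is only a minimal sketch of the "eliminate weak connections" idea under our own assumptions: a perceptron-style associative learner trained online, where after each weight update any connection whose magnitude falls below a threshold is set to zero. The learning rate `eta`, pruning threshold `theta`, and the random data are illustrative choices, not values from the paper.

```python
import numpy as np

# Hypothetical sketch of online weak-connection pruning during
# associative (perceptron-style) learning; not the authors' exact model.
rng = np.random.default_rng(0)
n_inputs, n_patterns = 100, 50
X = rng.choice([-1.0, 1.0], size=(n_patterns, n_inputs))  # input patterns
y = rng.choice([-1.0, 1.0], size=n_patterns)              # target outputs

w = np.zeros(n_inputs)
eta, theta = 0.05, 0.02  # learning rate and pruning threshold (assumed)

for epoch in range(200):
    for x, t in zip(X, y):
        if np.sign(w @ x) != t:      # error-driven associative update
            w += eta * t * x
        w[np.abs(w) < theta] = 0.0   # online elimination of weak connections

sparsity = np.mean(w == 0)
accuracy = np.mean(np.sign(X @ w) == y)
print(f"fraction of zero weights: {sparsity:.2f}, recall accuracy: {accuracy:.2f}")
```

Pruning inside the training loop, rather than once after convergence, is what makes the rule "online": sparsity is maintained throughout learning rather than imposed afterward, which is the property the abstract highlights as compatible with neuroscience and machine learning settings.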