{"title":"Sparse learning enabled by constraints on connectivity and function","authors":"Mirza M. Junaid Baig, Armen Stepanyants","doi":"arxiv-2409.04946","DOIUrl":null,"url":null,"abstract":"Sparse connectivity is a hallmark of the brain and a desired property of\nartificial neural networks. It promotes energy efficiency, simplifies training,\nand enhances the robustness of network function. Thus, a detailed understanding\nof how to achieve sparsity without jeopardizing network performance is\nbeneficial for neuroscience, deep learning, and neuromorphic computing\napplications. We used an exactly solvable model of associative learning to\nevaluate the effects of various sparsity-inducing constraints on connectivity\nand function. We determine the optimal level of sparsity achieved by the $l_0$\nnorm constraint and find that nearly the same efficiency can be obtained by\neliminating weak connections. We show that this method of achieving sparsity\ncan be implemented online, making it compatible with neuroscience and machine\nlearning applications.","PeriodicalId":501517,"journal":{"name":"arXiv - QuanBio - Neurons and Cognition","volume":"27 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Neurons and Cognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.04946","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Sparse connectivity is a hallmark of the brain and a desired property of artificial neural networks. It promotes energy efficiency, simplifies training, and enhances the robustness of network function. Thus, a detailed understanding of how to achieve sparsity without jeopardizing network performance is beneficial for neuroscience, deep learning, and neuromorphic computing applications. We use an exactly solvable model of associative learning to evaluate the effects of various sparsity-inducing constraints on connectivity and function. We determine the optimal level of sparsity achieved by the $l_0$ norm constraint and find that nearly the same efficiency can be obtained by eliminating weak connections. We show that this method of achieving sparsity can be implemented online, making it compatible with neuroscience and machine learning applications.
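To make the pruning idea concrete, below is a minimal, hypothetical sketch, not the paper's exactly solvable model: a perceptron-style associative learner that zeroes out connections whose weight magnitude falls below a fixed threshold during online training. All names and parameter values (`N`, `P`, `theta`, `eta`) are illustrative assumptions.

```python
import numpy as np

# Hedged sketch (not the authors' exact model): online associative learning with
# a perceptron-like update rule, where connections whose weights fall below a
# magnitude threshold are pruned to zero, illustrating sparsity obtained by
# eliminating weak connections during training.

rng = np.random.default_rng(0)

N = 200        # number of input neurons (assumed)
P = 100        # number of input-output associations to store (assumed)
theta = 0.05   # pruning threshold on |weight| (assumed)
eta = 0.01     # learning rate (assumed)

X = rng.choice([-1.0, 1.0], size=(P, N))   # random input patterns
y = rng.choice([-1.0, 1.0], size=P)        # target outputs

w = np.zeros(N)

for epoch in range(100):
    errors = 0
    for mu in rng.permutation(P):
        if y[mu] * (X[mu] @ w) <= 0:       # association not yet stored
            w += eta * y[mu] * X[mu]       # perceptron-style weight update
            errors += 1
        # online pruning step: weak connections are eliminated as learning proceeds
        w[np.abs(w) < theta] = 0.0
    if errors == 0:
        break

sparsity = np.mean(w == 0.0)
accuracy = np.mean(np.sign(X @ w) == y)
print(f"fraction of zero weights: {sparsity:.2f}, training accuracy: {accuracy:.2f}")
```

Because the threshold is applied inside the training loop rather than after it, the pruning operates online, which is the property the abstract highlights as making the approach compatible with both neuroscience and machine learning settings.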