{"title":"Multilayered Neural Networks With Sparse, Data-driven Connectivity and Balanced Information and Energy Efficiency","authors":"R. Baxter, W. Levy","doi":"10.1109/CISS.2019.8692785","DOIUrl":null,"url":null,"abstract":"This paper studies the developmental dynamics of connectivity and function of a three layer neural network that starts with zero connections. Adaptive synaptogenesis networks combine random synaptogenesis, associative synaptic modification, and synaptic shedding to construct sparse networks that develop codes useful for discriminating input patterns. Empirical observations of brain development inspire several extensions to adaptive synaptogenesis networks. These extensions include: (i) multiple neuronal layers, (ii) neuron survival and death based on information transmission, and (iii) bigrade growth factor signaling to control the onset of synaptogenesis in succeeding layers and to control neuron survival and death in preceding layers. Simulations of the network model demonstrate the parametric and functional control of both performance and energy expenditures, where performance is measured in terms of information loss and classification errors, and energy expenditures are assumed to be a function of the number of neurons. Major insights from this study include (a) the key role a neural layer between two other layers has in controlling synaptogenesis and neuron elimination, (b) the performance and energy-savings benefits of delaying the onset of synaptogenesis in a succeeding layer, and (c) the elimination of neurons is accomplished without significantly degrading information transfer or classification performance while providing energy savings and code compression.","PeriodicalId":123696,"journal":{"name":"2019 53rd Annual Conference on Information Sciences and Systems (CISS)","volume":"43 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 53rd Annual Conference on Information Sciences and Systems (CISS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISS.2019.8692785","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
This paper studies the developmental dynamics of connectivity and function of a three-layer neural network that starts with zero connections. Adaptive synaptogenesis networks combine random synaptogenesis, associative synaptic modification, and synaptic shedding to construct sparse networks that develop codes useful for discriminating input patterns. Empirical observations of brain development inspire several extensions to adaptive synaptogenesis networks. These extensions include: (i) multiple neuronal layers, (ii) neuron survival and death based on information transmission, and (iii) bigrade growth factor signaling to control the onset of synaptogenesis in succeeding layers and to control neuron survival and death in preceding layers. Simulations of the network model demonstrate the parametric and functional control of both performance and energy expenditures, where performance is measured in terms of information loss and classification errors, and energy expenditures are assumed to be a function of the number of neurons. Major insights from this study include (a) the key role that a neural layer situated between two other layers plays in controlling synaptogenesis and neuron elimination, (b) the performance and energy-saving benefits of delaying the onset of synaptogenesis in a succeeding layer, and (c) the finding that neurons can be eliminated without significantly degrading information transfer or classification performance, while providing energy savings and code compression.
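The abstract does not give the model's update equations, so the following Python sketch is only a rough illustration of the three-part cycle it names: random synaptogenesis, associative synaptic modification, and synaptic shedding. The layer sizes, growth and shedding rates, thresholds, and the postsynaptically gated associative rule used here are assumptions for illustration, not the authors' model.

```python
import numpy as np

# Minimal sketch of an adaptive-synaptogenesis cycle: random synapse
# formation, associative weight modification, and shedding of weak synapses.
# All parameters below are illustrative assumptions, not the paper's values.
rng = np.random.default_rng(0)

N_IN, N_OUT = 20, 8        # assumed layer sizes
P_GROW = 0.05              # chance an absent synapse forms on a given step
W_INIT = 0.15              # nominal weight for a newly formed synapse
SHED_THRESH = 0.10         # synapses weaker than this are shed
EPSILON = 0.10             # associative learning rate
FIRE_THRESH = 0.5          # postsynaptic firing threshold

weights = np.zeros((N_OUT, N_IN))             # network starts with zero connections
connected = np.zeros((N_OUT, N_IN), dtype=bool)

def develop_step(x):
    """One developmental step on a binary input pattern x."""
    # (1) Random synaptogenesis: absent synapses form with small probability.
    grown = (~connected) & (rng.random(connected.shape) < P_GROW)
    connected[grown] = True
    weights[grown] = W_INIT

    # (2) Associative modification: a postsynaptically gated rule (assumed),
    #     pulling each firing neuron's weights toward the current input.
    y = ((weights @ x) > FIRE_THRESH).astype(float)
    weights += EPSILON * y[:, None] * (x[None, :] - weights) * connected

    # (3) Synaptic shedding: prune synapses whose weights have decayed.
    weak = connected & (weights < SHED_THRESH)
    connected[weak] = False
    weights[weak] = 0.0
    return y

# Drive development with input lines that fire at different rates, so weights
# onto rarely co-active lines decay below the shedding threshold.
p_fire = rng.uniform(0.02, 0.6, size=N_IN)
for _ in range(500):
    develop_step((rng.random(N_IN) < p_fire).astype(float))

print(f"surviving synapses: {int(connected.sum())} / {N_IN * N_OUT}")
```

Under this sketch, a synapse whose presynaptic line is rarely active when its postsynaptic neuron fires sees its weight decay toward that low conditional co-activity and is eventually shed, which is the sparsification behavior the abstract describes; the paper's multilayer, neuron-elimination, and growth-factor extensions are not modeled here.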