{"title":"基于字典学习和突触归一化的神经回路中的脉冲LCA","authors":"Diego Chavez Arana, Alpha Renner, A. Sornborger","doi":"10.1145/3584954.3584968","DOIUrl":null,"url":null,"abstract":"The Locally Competitive Algorithm (LCA) [17, 18] was put forward as a model of primary visual cortex [14, 17] and has been used extensively as a sparse coding algorithm for multivariate data. LCA has seen implementations on neuromorphic processors, including IBM’s TrueNorth processor [10], and Intel’s neuromorphic research processor, Loihi, which show that it can be very efficient with respect to the power resources it consumes [8]. When combined with dictionary learning [13], the LCA algorithm encounters synaptic instability [24], where, as a synapse’s strength grows, its activity increases, further enhancing synaptic strength, leading to a runaway condition, where synapses become saturated [3, 15]. A number of approaches have been suggested to stabilize this phenomenon [1, 2, 5, 7, 12]. Previous work demonstrated that, by extending the cost function used to generate LCA updates, synaptic normalization could be achieved, eliminating synaptic runaway [7]. It was also shown that the resulting algorithm could be implemented in a firing rate model [7]. Here, we implement a probabilistic approximation to this firing rate model as a spiking LCA algorithm that includes dictionary learning and synaptic normalization. The algorithm is based on a synfire-gated synfire chain-based information control network in concert with Hebbian synapses [16, 19]. We show that this algorithm results in correct classification on numeric data taken from the MNIST dataset. 
LA-UR-22-33004","PeriodicalId":375527,"journal":{"name":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","volume":"26 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Spiking LCA in a Neural Circuit with Dictionary Learning and Synaptic Normalization\",\"authors\":\"Diego Chavez Arana, Alpha Renner, A. Sornborger\",\"doi\":\"10.1145/3584954.3584968\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The Locally Competitive Algorithm (LCA) [17, 18] was put forward as a model of primary visual cortex [14, 17] and has been used extensively as a sparse coding algorithm for multivariate data. LCA has seen implementations on neuromorphic processors, including IBM’s TrueNorth processor [10], and Intel’s neuromorphic research processor, Loihi, which show that it can be very efficient with respect to the power resources it consumes [8]. When combined with dictionary learning [13], the LCA algorithm encounters synaptic instability [24], where, as a synapse’s strength grows, its activity increases, further enhancing synaptic strength, leading to a runaway condition, where synapses become saturated [3, 15]. A number of approaches have been suggested to stabilize this phenomenon [1, 2, 5, 7, 12]. Previous work demonstrated that, by extending the cost function used to generate LCA updates, synaptic normalization could be achieved, eliminating synaptic runaway [7]. It was also shown that the resulting algorithm could be implemented in a firing rate model [7]. Here, we implement a probabilistic approximation to this firing rate model as a spiking LCA algorithm that includes dictionary learning and synaptic normalization. The algorithm is based on a synfire-gated synfire chain-based information control network in concert with Hebbian synapses [16, 19]. 
We show that this algorithm results in correct classification on numeric data taken from the MNIST dataset. LA-UR-22-33004\",\"PeriodicalId\":375527,\"journal\":{\"name\":\"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference\",\"volume\":\"26 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-04-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3584954.3584968\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2023 Annual Neuro-Inspired Computational Elements Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3584954.3584968","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Spiking LCA in a Neural Circuit with Dictionary Learning and Synaptic Normalization
The Locally Competitive Algorithm (LCA) [17, 18] was put forward as a model of primary visual cortex [14, 17] and has been used extensively as a sparse coding algorithm for multivariate data. LCA has been implemented on neuromorphic processors, including IBM's TrueNorth [10] and Intel's neuromorphic research processor, Loihi, where it has proven highly power-efficient [8]. When combined with dictionary learning [13], LCA suffers from synaptic instability [24]: as a synapse's strength grows, its activity increases, which further enhances its strength, producing a runaway condition in which synapses saturate [3, 15]. A number of approaches have been suggested to stabilize this phenomenon [1, 2, 5, 7, 12]. Previous work demonstrated that, by extending the cost function used to generate LCA updates, synaptic normalization can be achieved, eliminating synaptic runaway [7]; it was also shown that the resulting algorithm can be implemented in a firing rate model [7]. Here, we implement a probabilistic approximation to this firing rate model as a spiking LCA algorithm that includes dictionary learning and synaptic normalization. The algorithm is based on a synfire-gated synfire chain information control network operating in concert with Hebbian synapses [16, 19]. We show that this algorithm yields correct classification on numeric data drawn from the MNIST dataset.

LA-UR-22-33004
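To make the mechanism in the abstract concrete, here is a minimal rate-based sketch (not the authors' spiking or synfire-gated implementation) of standard LCA dynamics with a Hebbian dictionary update followed by explicit column normalization, the stabilizing step that prevents runaway synaptic growth. All function names, learning rates, and thresholds are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(u, lam):
    # LCA activation: soft threshold on the membrane potential u
    a = u - lam * np.sign(u)
    a[np.abs(u) < lam] = 0.0
    return a

def lca(x, Phi, lam=0.1, tau=10.0, n_steps=200):
    """Rate-based LCA sparse coding of input x with dictionary Phi.

    Neurons compete through lateral inhibition Phi^T Phi - I; the
    fixed point approximately minimizes ||x - Phi a||^2 + lam ||a||_1.
    """
    G = Phi.T @ Phi - np.eye(Phi.shape[1])  # lateral inhibition
    b = Phi.T @ x                           # feedforward drive
    u = np.zeros(Phi.shape[1])
    for _ in range(n_steps):
        a = soft_threshold(u, lam)
        u += (b - u - G @ a) / tau
    return soft_threshold(u, lam)

def dict_update(x, Phi, a, eta=0.01):
    """Hebbian update on the reconstruction residual, then renormalize
    each dictionary column to unit norm so no synapse can run away."""
    Phi = Phi + eta * np.outer(x - Phi @ a, a)
    Phi /= np.linalg.norm(Phi, axis=0, keepdims=True) + 1e-12
    return Phi
```

Without the normalization in `dict_update`, a frequently active column keeps growing, wins the competition more often, and grows further; renormalizing after each update (here by projection rather than by the paper's extended cost function) breaks that positive feedback loop.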