{"title":"Stepwise Weighted Spike Coding for Deep Spiking Neural Networks","authors":"Yiwen Gu, Junchuan Gu, Haibin Shen, Kejie Huang","doi":"arxiv-2408.17245","DOIUrl":null,"url":null,"abstract":"Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of\nbiological neurons and are expected to play a key role in the advancement of\nneural computing and artificial intelligence. The efficiency of SNNs is often\ndetermined by the neural coding schemes. Existing coding schemes either cause\nhuge delays and energy consumption or necessitate intricate neuron models and\ntraining techniques. To address these issues, we propose a novel Stepwise\nWeighted Spike (SWS) coding scheme to enhance the encoding of information in\nspikes. This approach compresses the spikes by weighting the significance of\nthe spike in each step of neural computation, achieving high performance and\nlow energy consumption. A Ternary Self-Amplifying (TSA) neuron model with a\nsilent period is proposed for supporting SWS-based computing, aimed at\nminimizing the residual error resulting from stepwise weighting in neural\ncomputation. Our experimental results show that the SWS coding scheme\noutperforms the existing neural coding schemes in very deep SNNs, and\nsignificantly reduces operations and latency.","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":"9 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.17245","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Spiking Neural Networks (SNNs) seek to mimic the spiking behavior of biological neurons and are expected to play a key role in the advancement of neural computing and artificial intelligence. The efficiency of SNNs is often determined by the neural coding scheme. Existing coding schemes either incur high latency and energy consumption or require intricate neuron models and training techniques. To address these issues, we propose a novel Stepwise Weighted Spike (SWS) coding scheme that enhances the encoding of information in spikes. This approach compresses spikes by weighting the significance of each spike at every step of neural computation, achieving high performance and low energy consumption. To support SWS-based computing, we propose a Ternary Self-Amplifying (TSA) neuron model with a silent period, aimed at minimizing the residual error introduced by stepwise weighting. Our experimental results show that the SWS coding scheme outperforms existing neural coding schemes in very deep SNNs while significantly reducing operations and latency.
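
To make the idea concrete, below is a minimal Python sketch of stepwise weighted coding as the abstract describes it: each step carries a ternary spike whose contribution is scaled by a step-dependent weight, and the encoder carries the residual error forward to the next step. The base-2 weights, the greedy rounding rule, and the names sws_encode and sws_decode are illustrative assumptions; the paper's exact SWS scheme and TSA neuron dynamics are not specified in this abstract.

import numpy as np

def sws_encode(x, T=8, base=2.0):
    # Hypothetical illustration of stepwise weighted spike coding;
    # the paper's actual SWS/TSA formulation may differ.
    # Greedily encode a scalar into ternary spikes {-1, 0, +1},
    # one per time step, with step weight base**-(t+1).
    spikes = np.zeros(T, dtype=np.int8)
    residual = x
    for t in range(T):
        w = base ** -(t + 1)                       # stepwise weight for step t
        s = int(np.round(np.clip(residual / w, -1, 1)))
        spikes[t] = s
        residual -= s * w                          # carry residual error forward
    return spikes

def sws_decode(spikes, base=2.0):
    # Reconstruct the value as the weighted sum of the spikes.
    weights = base ** -(np.arange(1, len(spikes) + 1))
    return float(spikes @ weights)

x = 0.6180
spikes = sws_encode(x)
print(spikes, sws_decode(spikes))                  # approximately recovers x

Decoding is a single weighted sum, which illustrates why a stepwise weighted scheme can trade a handful of weighted steps for the many uniform steps a rate-coded representation of the same precision would require.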