Adaptive Spiking Neural Networks with Hybrid Coding
Huaxu He
arXiv - CS - Neural and Evolutionary Computing (arXiv:2408.12407), published 2024-08-22
Citations: 0
Abstract
The Spiking Neural Network (SNN), owing to its unique spike-driven nature, is a more
energy-efficient and effective neural network than the Artificial Neural Network (ANN).
The encoding method directly influences the overall performance of the network, and
direct encoding is currently the predominant choice for directly trained SNNs. On static
image datasets, however, direct encoding feeds the same feature map at every time step
and therefore fails to fully exploit the spatiotemporal properties of SNNs. Temporal
encoding, by contrast, converts the input data into spike trains with spatiotemporal
characteristics, but traditional SNNs use the same neurons to process the input across
different time steps, limiting their ability to integrate and exploit spatiotemporal
information effectively. To address this, this paper adopts temporal encoding and
proposes the Adaptive Spiking Neural Network (ASNN), which improves how conventional
SNNs utilize temporally encoded input. Temporal encoding is also used less often because
short time steps can discard a significant amount of input information, so practical
applications typically require a larger number of time steps; yet training large SNNs
over long time steps is challenging due to hardware constraints. To overcome this, this
paper introduces a hybrid encoding approach that reduces the number of time steps
required for training while further improving overall network performance. Notably,
significant improvements in classification performance are observed on both the
Spikformer and Spiking ResNet architectures. Our code is available at
https://github.com/hhx0320/ASNN
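
The contrast the abstract draws between direct and temporal encoding can be illustrated with a minimal NumPy sketch. This is not the paper's ASNN or hybrid scheme; it only shows the two baseline input encodings being compared: direct encoding repeats the same analog feature map at every time step, while a rate-based (Poisson-style) temporal encoding, one common form of temporal encoding, turns pixel intensities into a binary spike train. Function names and shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def direct_encoding(image, time_steps):
    # Direct encoding: the identical analog feature map is presented
    # at every time step, so no temporal structure is introduced.
    return np.stack([image] * time_steps, axis=0)

def rate_temporal_encoding(image, time_steps):
    # Rate (Poisson-style) temporal encoding: each pixel intensity in [0, 1]
    # is the per-step firing probability, yielding a binary spike train.
    # With few time steps the sampled spikes approximate the input poorly,
    # which is why temporal encoding often needs longer time windows.
    return (rng.random((time_steps,) + image.shape) < image).astype(np.float32)

image = rng.random((4, 4))            # toy "static image", intensities in [0, 1]
d = direct_encoding(image, 8)         # shape (8, 4, 4): eight identical frames
t = rate_temporal_encoding(image, 8)  # shape (8, 4, 4): binary spikes per step
```

Averaging the spike train over time steps recovers an estimate of the original intensities, and the estimate sharpens as the number of time steps grows, which mirrors the abstract's point that short time steps lose input information.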