Adaptive Spiking Neural Networks with Hybrid Coding

Huaxu He
{"title":"Adaptive Spiking Neural Networks with Hybrid Coding","authors":"Huaxu He","doi":"arxiv-2408.12407","DOIUrl":null,"url":null,"abstract":"The Spiking Neural Network (SNN), due to its unique spiking-driven nature, is\na more energy-efficient and effective neural network compared to Artificial\nNeural Networks (ANNs). The encoding method directly influences the overall\nperformance of the network, and currently, direct encoding is primarily used\nfor directly trained SNNs. When working with static image datasets, direct\nencoding inputs the same feature map at every time step, failing to fully\nexploit the spatiotemporal properties of SNNs. While temporal encoding converts\ninput data into spike trains with spatiotemporal characteristics, traditional\nSNNs utilize the same neurons when processing input data across different time\nsteps, limiting their ability to integrate and utilize spatiotemporal\ninformation effectively.To address this, this paper employs temporal encoding\nand proposes the Adaptive Spiking Neural Network (ASNN), enhancing the\nutilization of temporal encoding in conventional SNNs. Additionally, temporal\nencoding is less frequently used because short time steps can lead to\nsignificant loss of input data information, often necessitating a higher number\nof time steps in practical applications. However, training large SNNs with long\ntime steps is challenging due to hardware constraints. To overcome this, this\npaper introduces a hybrid encoding approach that not only reduces the required\ntime steps for training but also continues to improve the overall network\nperformance.Notably, significant improvements in classification performance are\nobserved on both Spikformer and Spiking ResNet architectures.our code is\navailable at https://github.com/hhx0320/ASNN","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":"7 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.12407","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

The Spiking Neural Network (SNN), due to its unique spike-driven nature, is a more energy-efficient and effective neural network compared to Artificial Neural Networks (ANNs). The encoding method directly influences the overall performance of the network, and currently direct encoding is primarily used for directly trained SNNs. When working with static image datasets, direct encoding inputs the same feature map at every time step, failing to fully exploit the spatiotemporal properties of SNNs. While temporal encoding converts input data into spike trains with spatiotemporal characteristics, traditional SNNs use the same neurons when processing input data across different time steps, limiting their ability to integrate and exploit spatiotemporal information effectively. To address this, this paper employs temporal encoding and proposes the Adaptive Spiking Neural Network (ASNN), enhancing the utilization of temporal encoding in conventional SNNs. Additionally, temporal encoding is used less frequently because short time steps can lead to significant loss of input information, often necessitating a larger number of time steps in practical applications. However, training large SNNs with long time steps is challenging due to hardware constraints. To overcome this, this paper introduces a hybrid encoding approach that not only reduces the time steps required for training but also further improves overall network performance. Notably, significant improvements in classification performance are observed on both Spikformer and Spiking ResNet architectures. Our code is available at https://github.com/hhx0320/ASNN
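
For readers unfamiliar with the encoding schemes the abstract contrasts, the sketch below illustrates the difference in PyTorch. It is not the authors' implementation (their code is in the linked repository); the Bernoulli rate code and the way the hybrid scheme splits the time window are illustrative assumptions chosen only to make the distinction concrete.

```python
# Minimal sketch (not the authors' released code) contrasting the input-encoding
# schemes described in the abstract: direct encoding repeats the same analog
# feature map at every time step, while a temporal (here, Bernoulli rate) code
# turns intensities into spike trains; the hybrid simply concatenates the two.
# Shapes, the Bernoulli sampling, and the split point are assumptions, not
# details taken from the paper.
import torch


def direct_encoding(image: torch.Tensor, time_steps: int) -> torch.Tensor:
    """Repeat the analog image at every step -> shape (T, C, H, W)."""
    return image.unsqueeze(0).repeat(time_steps, 1, 1, 1)


def rate_encoding(image: torch.Tensor, time_steps: int) -> torch.Tensor:
    """Sample Bernoulli spikes with firing probability equal to pixel intensity,
    so each time step carries a different binary spatial pattern."""
    probs = direct_encoding(image.clamp(0.0, 1.0), time_steps)
    return torch.bernoulli(probs)


def hybrid_encoding(image: torch.Tensor, time_steps: int, direct_steps: int) -> torch.Tensor:
    """Feed the analog map for the first `direct_steps` steps, then spike trains."""
    return torch.cat(
        [direct_encoding(image, direct_steps),
         rate_encoding(image, time_steps - direct_steps)],
        dim=0,
    )


if __name__ == "__main__":
    img = torch.rand(3, 32, 32)                       # a CIFAR-sized input
    print(direct_encoding(img, 4).shape)              # torch.Size([4, 3, 32, 32])
    print(hybrid_encoding(img, 4, direct_steps=2).shape)
```

Under this reading, the hybrid scheme can keep the total number of time steps short (cheaper training) while still exposing the network to genuine spatiotemporal spike patterns, which is the trade-off the abstract describes.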