{"title":"CS-QCFS: Bridging the performance gap in ultra-low latency spiking neural networks.","authors":"Hongchao Yang, Suorong Yang, Lingming Zhang, Hui Dou, Furao Shen, Jian Zhao","doi":"10.1016/j.neunet.2024.107076","DOIUrl":null,"url":null,"abstract":"<p><p>Spiking Neural Networks (SNNs) are at the forefront of computational neuroscience, emulating the nuanced dynamics of biological systems. In the realm of SNN training methods, the conversion from ANNs to SNNs has generated significant interest due to its potential for creating energy-efficient and biologically plausible models. However, existing conversion methods often require long time-steps to ensure that the converted SNNs achieve performance comparable to the original ANNs. In this paper, we thoroughly investigate the process of ANN-SNN conversion and identify two critical issues: the frequently overlooked heterogeneity across channels and the emergence of negative thresholds, both of which lead to the problem of long time-steps. To address these issues, we introduce an innovative activation function called Channel-wise Softplus Quantization Clip-Floor-Shift (CS-QCFS) activation function. This function effectively handles the disparities between channels and maintain positive thresholds. This innovation enables us to achieve high-performance SNNs, particularly in ultra-low time-steps. Our experimental results demonstrate that the proposed method achieves state-of-the-art performance on CIFAR datasets. For instance, we achieve a top-1 accuracy of 95.86% on CIFAR-10 and 74.83% on CIFAR-100 with only 1 time-step.</p>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"184 ","pages":"107076"},"PeriodicalIF":6.0000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1016/j.neunet.2024.107076","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Spiking Neural Networks (SNNs) are at the forefront of computational neuroscience, emulating the nuanced dynamics of biological systems. Among SNN training methods, conversion from ANNs to SNNs has attracted significant interest for its potential to produce energy-efficient and biologically plausible models. However, existing conversion methods often require long time-steps to ensure that the converted SNNs match the performance of the original ANNs. In this paper, we thoroughly investigate the ANN-SNN conversion process and identify two critical issues: the frequently overlooked heterogeneity across channels and the emergence of negative thresholds, both of which lead to long time-steps. To address these issues, we introduce the Channel-wise Softplus Quantization Clip-Floor-Shift (CS-QCFS) activation function, which handles the disparities between channels and maintains positive thresholds. This enables high-performance SNNs, particularly at ultra-low time-steps. Our experiments show that the proposed method achieves state-of-the-art performance on the CIFAR datasets: for instance, a top-1 accuracy of 95.86% on CIFAR-10 and 74.83% on CIFAR-100 with only one time-step.
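The abstract does not give the exact formulation, but a plausible sketch can be assembled from the standard QCFS activation, λ·clip(⌊xL/λ + ½⌋/L, 0, 1), combined with the channel-wise, softplus-parameterized threshold the paper describes, which keeps thresholds strictly positive by construction. Everything below — the class name `CSQCFS`, the parameter names, and the ½ shift constant — is an assumption for illustration, not the authors' exact method.

```python
# Minimal sketch: a quantization clip-floor-shift activation with a
# per-channel threshold kept positive via softplus. Assumed, not the
# paper's verbatim implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CSQCFS(nn.Module):
    def __init__(self, num_channels: int, num_levels: int = 4):
        super().__init__()
        # One learnable threshold parameter per channel; softplus maps it
        # to a strictly positive effective threshold, so negative
        # thresholds cannot arise.
        self.theta = nn.Parameter(torch.zeros(num_channels))
        self.L = num_levels  # quantization levels, tied to SNN time-steps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W); broadcast the per-channel threshold lambda.
        lam = F.softplus(self.theta).view(1, -1, 1, 1)
        # Quantize with floor and a 0.5 shift, then clip to [0, lam].
        # (Training such activations typically uses a straight-through
        # estimator for the non-differentiable floor.)
        y = torch.floor(x * self.L / lam + 0.5)
        return lam * torch.clamp(y / self.L, 0.0, 1.0)

# Usage: replace ReLU in the source ANN before training, then convert to an SNN.
act = CSQCFS(num_channels=64, num_levels=4)
out = act(torch.randn(2, 64, 8, 8))
```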
About the Journal
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically inspired artificial intelligence.