Title: SNNAX -- Spiking Neural Networks in JAX
Authors: Jamie Lohoff, Jan Finkbeiner, Emre Neftci
Venue: arXiv - CS - Neural and Evolutionary Computing
Published: 2024-09-04
DOI: arxiv-2409.02842 (https://doi.org/arxiv-2409.02842)
Citations: 0
Abstract
Spiking Neural Network (SNN) simulators are essential tools for prototyping
biologically inspired models and neuromorphic hardware architectures and for
predicting their performance. For such a tool, ease of use and flexibility are
critical, but so is simulation speed, especially given the complexity inherent
in simulating SNNs. Here, we present SNNAX, a JAX-based framework for simulating
and training such models with PyTorch-like intuitiveness and JAX-like execution
speed. SNNAX models are easily extended and customized to fit the desired model
specifications and target neuromorphic hardware. Additionally, SNNAX offers key
features for optimizing the training and deployment of SNNs, such as flexible
automatic differentiation and just-in-time compilation. We evaluate SNNAX
against other machine learning (ML) frameworks commonly used for programming
SNNs. We provide key performance metrics, best practices, and documented
examples for simulating SNNs in SNNAX, and implement several benchmarks from
the literature.
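To make the abstract's claims concrete, the sketch below shows the general pattern a JAX-based SNN framework relies on: a non-differentiable spike function given a surrogate gradient via `jax.custom_vjp`, a leaky integrate-and-fire (LIF) update unrolled over time with `jax.lax.scan`, and the whole simulation compiled with `jax.jit`. This is a minimal illustration in plain JAX, not SNNAX's actual API; all function names and parameter values here are hypothetical.

```python
import jax
import jax.numpy as jnp

# Spike nonlinearity: Heaviside step in the forward pass,
# smooth sigmoid-derivative surrogate in the backward pass.
@jax.custom_vjp
def spike(v):
    return (v > 0.0).astype(jnp.float32)

def _spike_fwd(v):
    return (v > 0.0).astype(jnp.float32), v

def _spike_bwd(v, g):
    # Surrogate derivative: sigmoid'(v) stands in for the
    # non-differentiable derivative of the Heaviside step.
    s = jax.nn.sigmoid(v)
    return (g * s * (1.0 - s),)

spike.defvjp(_spike_fwd, _spike_bwd)

def lif_step(v, x, decay=0.9, threshold=1.0):
    """One leaky integrate-and-fire update with soft reset."""
    v = decay * v + x            # leaky integration of input current
    s = spike(v - threshold)     # emit a spike where v exceeds threshold
    v = v - s * threshold        # soft reset: subtract threshold on spike
    return v, s

@jax.jit
def run_lif(inputs):
    """Unroll LIF neurons over the leading time axis of `inputs`."""
    v0 = jnp.zeros(inputs.shape[1:])
    _, spikes = jax.lax.scan(lif_step, v0, inputs)
    return spikes
```

Because the time loop is expressed with `lax.scan` and wrapped in `jax.jit`, the whole simulation compiles to a single XLA program, and `jax.grad` can differentiate end-to-end through the spike train via the surrogate, which is the combination of features (JIT compilation plus flexible automatic differentiation) the abstract highlights.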