Robust and accelerated single-spike spiking neural network training with applicability to challenging temporal tasks
Luke Taylor, Andrew King, Nicol Harper

Abstract
Spiking neural networks (SNNs), particularly the single-spike variant in which neurons spike at most once, are considerably more energy efficient than standard artificial neural networks (ANNs). However, single-spike SNNs are difficult to train due to their dynamic and non-differentiable nature, and current solutions are either slow or suffer from training instabilities. These networks have also been critiqued for their limited computational applicability, for example being considered unsuitable for time-series datasets. We propose a new model for training single-spike SNNs that mitigates the aforementioned training issues and obtains competitive results across various image and neuromorphic datasets, with up to a $13.98\times$ training speedup and up to an $81\%$ reduction in spikes compared to the multi-spike SNN. Notably, our model performs on par with multi-spike SNNs on challenging tasks involving neuromorphic time-series datasets, demonstrating a broader computational role for single-spike SNNs than previously believed.
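The defining constraint of the single-spike variant can be illustrated with a minimal sketch. The following is not the paper's FastSNN model; it is a generic, hypothetical leaky integrate-and-fire neuron restricted to firing at most once (the time-to-first-spike coding the abstract refers to), with illustrative parameter values:

```python
# Illustrative sketch (not the paper's method): a leaky integrate-and-fire
# neuron constrained to spike at most once, as in single-spike SNNs.

def first_spike_time(inputs, weights, threshold=1.0, leak=0.9):
    """Return the first time step at which the membrane potential
    crosses `threshold`, or None if the neuron never spikes.
    `inputs` is a list of per-time-step input vectors."""
    v = 0.0
    for t, x in enumerate(inputs):
        # Leaky integration of the weighted input current.
        v = leak * v + sum(w * xi for w, xi in zip(weights, x))
        if v >= threshold:
            return t  # the neuron spikes once and is silent thereafter
    return None

# A constant drive pushes the neuron over threshold after a few steps.
inputs = [[0.4, 0.2]] * 10
weights = [1.0, 1.0]
print(first_spike_time(inputs, weights))  # → 1
```

Because each neuron emits at most one spike, the spike *time* carries the information, which is what makes the model energy efficient but also hard to differentiate through.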
Benchmarks
| Benchmark | Methodology | Metrics |
|---|---|---|
| audio-classification-on-shd | FastSNN | Percentage correct: 70.32 |
| image-classification-on-n-mnist | FastSNN | Accuracy: 95.91 |
| image-classification-on-fashion-mnist | FastSNN (MLP) | Accuracy: 89.05 |
| image-classification-on-fashion-mnist | FastSNN (CNN) | Accuracy: 90.57 |
| image-classification-on-mnist | FastSNN (CNN) | Accuracy: 99.3 |
| image-classification-on-mnist | FastSNN (MLP) | Accuracy: 97.91 |