Spiking Neural Networks: Efficiency and Robustness

Research on Spiking Neural Networks (SNNs) is increasingly focused on efficiency and robustness: reducing computational energy consumption and inference latency while maintaining or improving accuracy. One notable direction is single-timestep SNNs, which perform inference in a single forward pass and thereby cut both energy use and latency substantially. Another focus is robustness to adversarial perturbations, with techniques such as robust temporal self-ensembling on the defense side and timestep-compressed attacks on the evaluation side. There is also growing interest in adaptive computation time methods, such as Spatio-Temporal Adaptive Computation Time (STAS), which mitigate the high latency and computational overhead of spiking Transformers.

Noteworthy papers include SDSNN, a single-timestep SNN that combines a self-dropping neuron with Bayesian optimization to reach state-of-the-art accuracy at reduced energy cost; STAS, which introduces an integrated spike patch splitting module and an adaptive spiking self-attention module to lower energy consumption while improving accuracy; and Quantization Meets Spikes, which achieves nearly lossless ANN-to-SNN conversion at the first timestep via Polarity Multi-Spike Mapping and a hyperparameter adjustment strategy.
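To make the efficiency argument concrete, the sketch below simulates a generic leaky integrate-and-fire (LIF) layer for T timesteps and counts multiply-accumulate operations as a rough energy proxy; running the same layer for one timestep performs 1/T of the work. This is a minimal, hypothetical illustration of the single-timestep idea, not the SDSNN architecture itself (the decay constant, threshold, and rate coding here are assumptions).

```python
import numpy as np

def lif_forward(x, w, timesteps, v_th=1.0, decay=0.9):
    """Run a layer of leaky integrate-and-fire (LIF) neurons.

    Returns the per-neuron spike counts and the number of
    multiply-accumulate operations performed (a rough energy proxy).
    Hypothetical sketch -- parameters are illustrative assumptions.
    """
    v = np.zeros(w.shape[1])           # membrane potentials
    spike_counts = np.zeros_like(v)
    ops = 0
    for _ in range(timesteps):
        i_in = x @ w                   # input current (constant-rate coding)
        ops += x.size * w.shape[1]     # count multiply-accumulates
        v = decay * v + i_in           # leaky integration
        spikes = (v >= v_th).astype(float)
        v = np.where(spikes > 0, 0.0, v)   # hard reset after a spike
        spike_counts += spikes
    return spike_counts, ops

rng = np.random.default_rng(0)
x = rng.random(8)                      # one input sample
w = rng.random((8, 4))                 # weights for a 4-neuron layer
_, ops_multi = lif_forward(x, w, timesteps=4)
_, ops_single = lif_forward(x, w, timesteps=1)
print(ops_single, ops_multi)           # single timestep does 1/4 the work
```

The cost scales linearly with the number of timesteps, which is why collapsing inference to one timestep (as in SDSNN, or via first-timestep ANN-to-SNN conversion) directly reduces both latency and accumulated synaptic operations.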

Sources

SDSNN: A Single-Timestep Spiking Neural Network with Self-Dropping Neuron and Bayesian Optimization

Boosting the Robustness-Accuracy Trade-off of SNNs by Robust Temporal Self-Ensemble

Timestep-Compressed Attack on Spiking Neural Networks through Timestep-Level Backpropagation

STAS: Spatio-Temporal Adaptive Computation Time for Spiking Transformers

Quantization Meets Spikes: Lossless Conversion in the First Timestep via Polarity Multi-Spike Mapping
