Spiking Neural Networks: Advances in Efficiency and Temporal Learning

The field of Spiking Neural Networks (SNNs) is moving towards improved energy efficiency and stronger temporal learning capabilities. Researchers are exploring techniques to optimize SNN performance, including principled tuning of neuron parameters and surrogate gradient descent, which together allow SNNs to learn from precise spike timing and improve their robustness to adversarial noise. In parallel, new datasets and benchmarks are being introduced to support energy-efficient video understanding and action recognition on spike-based data. Noteworthy papers include:

  • Beyond Rate Coding: Surrogate Gradients Enable Spike Timing Learning in Spiking Neural Networks, which demonstrates that SNNs trained with surrogate gradients can learn from precise spike timing rather than firing rates alone (a minimal sketch of the surrogate-gradient trick follows this list).
  • Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training, which proposes a fractional-order SNN model that captures long-term dependencies in membrane voltage and spike trains (a second sketch below illustrates the fractional update).
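
To make the surrogate-gradient trick concrete, here is a minimal PyTorch sketch assuming a simple leaky integrate-and-fire neuron (the names SurrogateSpike and lif_step are illustrative, not from the paper). The forward pass keeps the non-differentiable Heaviside spike, while the backward pass substitutes a smooth fast-sigmoid derivative so gradients can flow through exact spike times:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, smooth surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()  # spike when the membrane exceeds threshold

    @staticmethod
    def backward(ctx, grad_output):
        (v_minus_thresh,) = ctx.saved_tensors
        slope = 10.0  # surrogate sharpness, a hyperparameter
        # fast-sigmoid (SuperSpike-style) derivative replaces the Heaviside's zero gradient
        return grad_output / (1.0 + slope * v_minus_thresh.abs()) ** 2


def lif_step(v, x, decay=0.9, threshold=1.0):
    """One leaky integrate-and-fire step whose spike is differentiable via the surrogate."""
    v = decay * v + x                        # leaky integration of the input current
    spike = SurrogateSpike.apply(v - threshold)
    v = v - spike * threshold                # soft reset after a spike
    return v, spike
```

Unrolling lif_step over time and backpropagating through the sequence is what lets spike timing, not just firing rate, shape the learned weights.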

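The fractional-order idea can be pictured with a small NumPy sketch of a fractional leaky integrate-and-fire update based on a Grünwald-Letnikov discretization. This only illustrates how a fractional derivative of order alpha injects the whole voltage history into each update; it is not the paper's actual model or its adjoint-based training scheme:

```python
import numpy as np

def gl_coefficients(alpha, n):
    """Grünwald-Letnikov binomial weights for a fractional derivative of order alpha."""
    c = np.empty(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
    return c

def fractional_lif(inputs, alpha=0.6, dt=1.0, tau=10.0, threshold=1.0):
    """Fractional-order LIF neuron: each update mixes in a weighted sum over the
    whole voltage history, which is how long-term dependencies enter the dynamics."""
    coeffs = gl_coefficients(alpha, len(inputs) + 1)
    v_hist, spikes = [0.0], []
    for x in inputs:
        # memory term: Grünwald-Letnikov weights applied to all past voltages
        memory = sum(coeffs[k] * v_hist[-k] for k in range(1, len(v_hist) + 1))
        v = (dt ** alpha) * (-v_hist[-1] / tau + x) - memory
        s = float(v >= threshold)
        v_hist.append(v * (1.0 - s))         # hard reset to zero when a spike fires
        spikes.append(s)
    return spikes
```

With alpha = 1 the weights collapse to plain forward Euler, so the ordinary LIF neuron is recovered as a special case.
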
Sources

Analyzing Internal Activity and Robustness of SNNs Across Neuron Parameter Space

Beyond Rate Coding: Surrogate Gradients Enable Spike Timing Learning in Spiking Neural Networks

SPACT18: Spiking Human Action Recognition Benchmark Dataset with Complementary RGB and Thermal Modalities

Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training
