Advances in Neuromorphic Computing and Spiking Neural Networks

The field of neuromorphic computing and spiking neural networks (SNNs) is advancing rapidly, with a focus on developing more efficient, scalable, and biologically inspired models. Recent research has explored stochastic equilibrium propagation (EP), parallelism in FPGA-based accelerators, and compression and inference techniques for SNNs on resource-constrained hardware. These innovations aim to improve the performance and energy efficiency of SNNs, making them better suited to edge devices and real-world deployment.

Notable papers in this area include StochEP, a stochastic EP framework for training SNNs that achieves state-of-the-art performance on vision benchmarks while preserving locality, and a lightweight C-based runtime that enables efficient SNN inference on conventional embedded platforms. Other notable work introduces SpikeNM, a semi-structured N:M pruning framework for SNNs, and PACE, a dataset distillation framework for fast SNN training. Together, these advances demonstrate the potential of neuromorphic computing and SNNs to deliver efficient, low-power, and adaptive intelligence across a wide range of applications.
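To make the N:M sparsity idea concrete, the sketch below zeroes all but the N largest-magnitude weights in each contiguous group of M. This is a generic magnitude-based illustration of semi-structured N:M pruning, not the probability-based rule of the SpikeNM paper itself; the function name `nm_prune` and the example weights are illustrative assumptions.

```python
import numpy as np

def nm_prune(weights, n=2, m=4):
    """Keep the n largest-magnitude weights in each group of m, zero the rest.

    Generic magnitude-based N:M sparsity for illustration only; SpikeNM's
    actual probability-based pruning rule is not reproduced here.
    """
    flat = weights.flatten()  # copy, so the input array is left untouched
    assert flat.size % m == 0, "weight count must be divisible by m"
    groups = flat.reshape(-1, m)
    # Per group, find the indices of the (m - n) smallest-magnitude entries
    drop = np.argsort(np.abs(groups), axis=1)[:, : m - n]
    np.put_along_axis(groups, drop, 0.0, axis=1)
    return groups.reshape(weights.shape)

w = np.array([[0.9, -0.1, 0.4, 0.05],
              [-0.7, 0.2, -0.3, 0.6]])
print(nm_prune(w))  # each row of 4 keeps its 2 largest-magnitude weights
```

Hardware that supports N:M sparsity can exploit this regular structure (a fixed nonzero budget per group) for predictable memory savings and compute skipping, which is what makes it attractive for SNN deployment on constrained devices.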

Sources

StochEP: Stochastic Equilibrium Propagation for Spiking Convergent Recurrent Neural Networks

Exploring Parallelism in FPGA-Based Accelerators for Machine Learning Applications

Compression and Inference of Spiking Neural Networks on Resource-Constrained Hardware

Sparse by Rule: Probability-Based N:M Pruning for Spiking Neural Networks

Learning from Dense Events: Towards Fast Spiking Neural Networks Training via Event Dataset Distillation

LILogic Net: Compact Logic Gate Networks with Learnable Connectivity for Efficient Hardware Deployment

BSO: Binary Spiking Online Optimization Algorithm

SynapticCore-X: A Modular Neural Processing Architecture for Low-Cost FPGA Acceleration

DS-ATGO: Dual-Stage Synergistic Learning via Forward Adaptive Threshold and Backward Gradient Optimization for Spiking Neural Networks

MS2Edge: Towards Energy-Efficient and Crisp Edge Detection with Multi-Scale Residual Learning in SNNs

Attention via Synaptic Plasticity is All You Need: A Biologically Inspired Spiking Neuromorphic Transformer

DustNet: A Wireless Network of Ultrasonic Neural Implants

Neuromorphic Astronomy: An End-to-End SNN Pipeline for RFI Detection Hardware
