The field of spiking neural networks (SNNs) is advancing rapidly, with a focus on improving energy efficiency while maintaining performance. Recent developments include novel training methods, such as residual learning and spike-aware data pruning, that enable efficient training of SNNs under limited computational resources. Knowledge distillation has also been used to transfer the capabilities of large language models to SNN-based architectures, yielding substantial energy savings, and the integration of SNNs with neuromorphic hardware continues to show promise for reducing energy consumption. In addition, researchers have made progress on security concerns in SNNs, such as backdoor attacks.

Noteworthy papers in this area include:

- In-memory Training on Analog Devices with Limited Conductance States via Multi-tile Residual Learning, which proposes a residual learning framework to enable on-chip training with limited-state devices.
- SAFA-SNN: Sparsity-Aware On-Device Few-Shot Class-Incremental Learning with Fast-Adaptive Structure of Spiking Neural Network, which presents an SNN-based method for on-device few-shot class-incremental learning, achieving state-of-the-art performance while reducing energy consumption.
- Efficient Training of Spiking Neural Networks by Spike-aware Data Pruning, which introduces a spike-aware data pruning method that accelerates SNN training, achieving significant speedups while maintaining accuracy (see the sketch after this list).
- SpikingMamba: Towards Energy-Efficient Large Language Models via Knowledge Distillation from Mamba, which proposes an energy-efficient SNN-based large language model distilled from Mamba, achieving a 4.76x energy benefit with minimal accuracy sacrifice.
- Unsupervised Backdoor Detection and Mitigation for Spiking Neural Networks, which identifies key obstacles that hinder traditional backdoor defenses in SNNs and proposes an unsupervised post-training detection framework to overcome them.
- Vacuum Spiker: A Spiking Neural Network-Based Model for Efficient Anomaly Detection in Time Series, which introduces an SNN-based method for anomaly detection in time series, achieving competitive performance while significantly reducing energy consumption.
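
To make the data-pruning idea concrete, the sketch below shows a minimal PyTorch illustration. It uses a toy leaky integrate-and-fire (LIF) layer and a hypothetical importance score based on total evoked spike count; the layer, function names, and scoring rule are assumptions for illustration, not the exact procedure from the spike-aware pruning paper.

```python
import torch
import torch.nn as nn


class TinyLIFLayer(nn.Module):
    """Minimal leaky integrate-and-fire layer: integrates input current over
    several time steps, emits a binary spike when the membrane potential
    crosses a threshold, then applies a soft reset. Illustrative only."""

    def __init__(self, in_features, out_features, beta=0.9, threshold=1.0, steps=8):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.beta, self.threshold, self.steps = beta, threshold, steps

    def forward(self, x):
        mem = torch.zeros(x.shape[0], self.fc.out_features, device=x.device)
        spikes = []
        for _ in range(self.steps):
            mem = self.beta * mem + self.fc(x)          # leaky integration
            spk = (mem >= self.threshold).float()       # spike when threshold crossed
            mem = mem - spk * self.threshold            # soft reset
            spikes.append(spk)
        return torch.stack(spikes)                      # (steps, batch, out_features)


def prune_by_spike_activity(model, data, keep_fraction=0.5):
    """Keep the samples that evoke the most spikes (a hypothetical
    spike-aware importance proxy) and drop the rest from later epochs."""
    with torch.no_grad():
        spikes = model(data)                            # (steps, batch, features)
        scores = spikes.sum(dim=(0, 2))                 # total spike count per sample
    k = max(1, int(keep_fraction * data.shape[0]))
    keep_idx = torch.topk(scores, k).indices
    return data[keep_idx], keep_idx


if __name__ == "__main__":
    torch.manual_seed(0)
    model = TinyLIFLayer(in_features=16, out_features=32)
    batch = torch.randn(64, 16)
    kept, idx = prune_by_spike_activity(model, batch, keep_fraction=0.25)
    print(f"kept {kept.shape[0]} of {batch.shape[0]} samples")
```

The design choice here is simply that samples producing little spike activity contribute little to the gradient signal in an SNN, so they can be dropped first; any real method would refine this criterion and recompute scores as training progresses.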