The field of neuromorphic computing and temporal modeling is advancing rapidly, driven by the demand for energy-efficient, low-latency signal processing. Recent research applies spiking neural networks (SNNs) and state space models (SSMs) to tasks such as keyword spotting, event-based vision, and sequential pattern recognition. Novel neuron models and ANN-to-SNN conversion frameworks have enabled high-performance SNNs and SSMs for applications spanning speech processing and computer vision, while advances in hardware design, including devices based on ferroelectric capacitors and reservoir computing, have produced efficient and scalable neuromorphic systems. Notable papers in this area include:

- Low-Bit Data Processing Using Multiple-Output Spiking Neurons with Non-linear Reset Feedback, which proposes a novel neuron model for SNNs.
- Training-Free ANN-to-SNN Conversion for High-Performance Spiking Transformer, which introduces a training-free, high-performance conversion framework for Transformer architectures.
- eMamba: Efficient Acceleration Framework for Mamba Models in Edge Computing, which presents a comprehensive framework for deploying Mamba models on edge platforms.
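To make the spiking-neuron idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron with a hard reset. This is a textbook baseline, not the multiple-output, non-linear-reset-feedback model of the cited paper; all parameter names (`tau`, `threshold`, `v_reset`) are illustrative choices.

```python
def lif_simulate(inputs, tau=0.9, threshold=1.0, v_reset=0.0):
    """Simulate a leaky integrate-and-fire neuron over an input current train.

    Membrane update: v[t] = tau * v[t-1] + inputs[t].
    When v crosses the threshold, the neuron emits a spike and the
    potential is hard-reset to v_reset.
    """
    v = 0.0
    spikes = []
    for x in inputs:
        v = tau * v + x      # leaky integration of the input current
        if v >= threshold:   # threshold crossing -> emit a spike
            spikes.append(1)
            v = v_reset      # hard reset (a soft reset would subtract the threshold instead)
        else:
            spikes.append(0)
    return spikes

# Sub-threshold inputs only trigger a spike once enough charge accumulates:
print(lif_simulate([0.6, 0.6, 0.6, 0.0, 0.6]))  # → [0, 1, 0, 0, 1]
```

The binary spike train is what makes SNNs attractive on neuromorphic hardware: downstream computation is event-driven and can skip the zeros entirely.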
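Similarly, the state space models mentioned above reduce, at their core, to a linear recurrence. The scalar sketch below shows that recurrence only; Mamba itself makes the parameters input-dependent ("selective") and runs the scan over high-dimensional states, so `a`, `b`, and `c` here are illustrative constants, not the model's actual parameterization.

```python
def ssm_scan(xs, a=0.5, b=1.0, c=1.0):
    """Scalar linear state space recurrence.

    State update: h[t] = a * h[t-1] + b * x[t]
    Readout:      y[t] = c * h[t]
    """
    h = 0.0
    ys = []
    for x in xs:
        h = a * h + b * x   # recurrent state update
        ys.append(c * h)    # linear readout of the state
    return ys

# An impulse input decays geometrically through the state:
print(ssm_scan([1.0, 0.0, 0.0, 0.0]))  # → [1.0, 0.5, 0.25, 0.125]
```

Because the recurrence is linear in the state, it can also be evaluated as a convolution or a parallel scan, which is what makes SSM-based architectures efficient to accelerate on edge hardware.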