The field of neural ordinary differential equations (ODEs) and neuromorphic computing is evolving rapidly, with a focus on models that capture complex dynamics and patterns more accurately and efficiently. Recent work introduces Fourier ODEs, which transform time-series data into the frequency domain to expose global patterns and periodic behavior, and reports gains in both accuracy and efficiency over existing methods. Convergence analyses of continuous-depth graph neural networks now provide theoretical insight into their size transferability, while manifold-constrained neural ODEs show promise on high-dimensional datasets. Neural ODEs applied to physical-layer signal processing for next-generation MIMO systems have outperformed state-of-the-art baselines, and Lagrangian neural ODEs enable direct learning of Euler-Lagrange equations at zero additional inference cost. Together, these advances point toward more accurate and efficient models across applications ranging from time-series analysis to physical-layer signal processing.
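The frequency-domain idea behind Fourier ODEs can be illustrated with a minimal NumPy sketch (a hypothetical toy, not any paper's implementation): a time series with hidden periodic components becomes a handful of dominant modes after a Fourier transform, which is the global structure such models exploit.

```python
import numpy as np

# Toy time series: two hidden periodic components at 2 Hz and 5 Hz,
# sampled at 100 Hz over 10 seconds.
t = np.linspace(0.0, 10.0, 1000, endpoint=False)
signal = np.sin(2 * np.pi * 2.0 * t) + 0.5 * np.sin(2 * np.pi * 5.0 * t)

# Real-input FFT: moves the series into the frequency domain.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(signal.size, d=t[1] - t[0])

# The two strongest modes recover the underlying 2 Hz and 5 Hz periodicities.
dominant = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(dominant.tolist()))
```

In the time domain the two oscillations are entangled sample by sample; in the frequency domain they separate into two peaks, which is why periodic behavior that is hard to model locally becomes easy to read off globally.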