Integrating Physics and Mathematics in Neural Networks

The field of neural networks is moving towards a more integrated approach, combining concepts from physics and mathematics to improve performance and interpretability. Researchers are exploring the use of physical systems, such as springs and sticks, to model learning processes and to study the thermodynamics of learning itself. Additionally, there is growing interest in symplectic neural networks, which preserve the symplectic form of Hamiltonian systems and thereby enforce energy conservation and physical consistency. Noteworthy papers in this area include:

  • The proposal of a Kolmogorov-Arnold Representation-based Hamiltonian Neural Network, which replaces Multilayer Perceptrons with univariate transformations to better capture high-frequency and multi-scale dynamics.
  • The introduction of a symplectic convolutional neural network architecture, which leverages symplectic neural networks and proper symplectic decomposition to ensure that the convolution layer remains symplectic.
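To make the first idea concrete, here is a minimal NumPy sketch of a Kolmogorov-Arnold-style layer, in which every edge carries its own learned univariate function rather than a scalar weight followed by a shared nonlinearity. This is an illustrative toy, not the paper's architecture: the Fourier basis, layer sizes, and class name `KANLayer` are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

class KANLayer:
    """Toy Kolmogorov-Arnold layer: y_j = sum_i phi_{j,i}(x_i).

    Each edge (j, i) applies its own univariate transformation,
    here a small Fourier expansion phi(x) = sum_k c_k sin(f_k x)
    (a stand-in for whatever basis a real KAN would learn).
    """

    def __init__(self, d_in, d_out, n_basis=5):
        # One coefficient vector per edge: shape (d_out, d_in, n_basis).
        self.coeffs = rng.normal(size=(d_out, d_in, n_basis)) / n_basis
        self.freqs = np.arange(1, n_basis + 1, dtype=float)

    def __call__(self, x):
        # x: (batch, d_in) -> (batch, d_out)
        # sin(f_k * x_i) gives (batch, d_in, n_basis); contract with coeffs.
        basis = np.sin(x[..., None] * self.freqs)
        return np.einsum('oik,bik->bo', self.coeffs, basis)

layer = KANLayer(d_in=2, d_out=3)
x = rng.normal(size=(4, 2))
y = layer(x)
print(y.shape)  # (4, 3)
```

Because a sine basis resolves each frequency with a single coefficient, such univariate expansions can represent high-frequency components that a small MLP would struggle to fit, which is the motivation the digest attributes to the KAN-based Hamiltonian network.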
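The energy-conservation claim behind symplectic networks can be demonstrated with a small sketch: integrate Hamilton's equations with a symplectic (leapfrog) update and check that the energy error stays bounded over many steps. The quadratic Hamiltonian below stands in for a learned network H_theta, and the numerical gradient stands in for autodiff; both are simplifying assumptions for the example.

```python
import numpy as np

def hamiltonian(q, p, k=1.0, m=1.0):
    # Toy separable Hamiltonian H(q, p) = p^2/(2m) + k q^2 / 2
    # (a stand-in for a learned H_theta in a Hamiltonian neural network).
    return p**2 / (2 * m) + k * q**2 / 2

def grad_H(q, p, eps=1e-6):
    # Central-difference gradients dH/dq, dH/dp (autodiff in practice).
    dHdq = (hamiltonian(q + eps, p) - hamiltonian(q - eps, p)) / (2 * eps)
    dHdp = (hamiltonian(q, p + eps) - hamiltonian(q, p - eps)) / (2 * eps)
    return dHdq, dHdp

def leapfrog_step(q, p, dt=0.01):
    # Symplectic (leapfrog) update: it preserves the symplectic form,
    # so the energy error oscillates but does not drift.
    dHdq, _ = grad_H(q, p)
    p_half = p - 0.5 * dt * dHdq
    _, dHdp = grad_H(q, p_half)
    q_new = q + dt * dHdp
    dHdq_new, _ = grad_H(q_new, p_half)
    p_new = p_half - 0.5 * dt * dHdq_new
    return q_new, p_new

q, p = 1.0, 0.0
E0 = hamiltonian(q, p)
for _ in range(10_000):
    q, p = leapfrog_step(q, p)
drift = abs(hamiltonian(q, p) - E0)
print(f"energy drift after 10k steps: {drift:.2e}")
```

A non-symplectic integrator (e.g. forward Euler) applied to the same system would show energy growing steadily, which is exactly the failure mode the symplectic architectures above are designed to rule out by construction.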

Sources

Tessellation Groups, Harmonic Analysis on Non-compact Symmetric Spaces and the Heat Kernel in view of Cartan Convolutional Neural Networks

Learning with springs and sticks

Saddle Hierarchy in Dense Associative Memory

Kolmogorov-Arnold Representation for Symplectic Learning: Advancing Hamiltonian Neural Networks

Symplectic convolutional neural networks
