Kinetic Theory and Neural Networks

Work at the intersection of kinetic theory and neural networks is moving toward more efficient and accurate methods for modeling complex systems. Researchers are combining physical structure with machine learning to improve the scalability and generalizability of models. One notable direction is the use of geometric and differential-geometric constructions for neural networks that preserve fundamental physical properties, such as energy conservation. Another is the development of new numerical methods, such as micro-macro kinetic flux-vector splitting schemes, that solve kinetic equations more efficiently. Noteworthy papers include:

  • Fast-Forward Lattice Boltzmann: Learning Kinetic Behaviour with Physics-Informed Neural Operators, which introduces a physics-informed neural operator framework for the lattice Boltzmann equation (first sketch below).
  • Learning Hamiltonian Dynamics at Scale: A Differential-Geometric Approach, which presents a physics-inspired neural network that combines the conservation laws of Hamiltonian mechanics with model order reduction (second sketch below).
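
The first paper's operator architecture and loss are not described in this digest, but a physics-informed loss for the lattice Boltzmann equation typically penalizes the residual of the BGK collide-and-stream update. Below is a minimal NumPy sketch of that residual on the standard D2Q9 lattice; the function names `equilibrium` and `bgk_residual` are illustrative assumptions, not the paper's API.

```python
import numpy as np

# Standard D2Q9 lattice constants: 9 discrete velocities and weights.
C = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
W = np.array([4/9] + [1/9] * 4 + [1/36] * 4)

def equilibrium(rho, u):
    # Maxwell-Boltzmann equilibrium truncated to second order in velocity.
    cu = np.einsum('id,xyd->xyi', C, u)            # c_i . u at each site
    usq = np.sum(u**2, axis=-1, keepdims=True)     # |u|^2
    return W * rho[..., None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def bgk_residual(f, f_next, tau):
    # Residual of the lattice Boltzmann BGK update
    #   f_i(x + c_i, t + 1) = f_i(x, t) - (1/tau) * (f_i - f_i^eq)(x, t).
    # A physics-informed training loss would drive this residual toward
    # zero for distributions f, f_next predicted by the neural operator.
    rho = f.sum(axis=-1)
    u = np.einsum('xyi,id->xyd', f, C) / rho[..., None]
    post_collision = f - (f - equilibrium(rho, u)) / tau
    streamed = np.stack(
        [np.roll(post_collision[..., i], shift=tuple(C[i]), axis=(0, 1))
         for i in range(9)],
        axis=-1)
    return f_next - streamed
```
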
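The second paper's architecture and reduction scheme are likewise not detailed here, but the core idea of an energy-conserving learned model can be sketched: learn a scalar Hamiltonian H(q, p) with a network and derive the dynamics from Hamilton's equations, so the learned energy is conserved by the continuous-time flow by construction. A minimal PyTorch sketch follows; `HamiltonianNN` and `leapfrog_step` are hypothetical names, and the leapfrog update is strictly symplectic only for separable Hamiltonians.

```python
import torch
import torch.nn as nn

class HamiltonianNN(nn.Module):
    """Learns a scalar H(q, p); the vector field comes from its gradients."""

    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, q, p):
        # Hamilton's equations: dq/dt = dH/dp, dp/dt = -dH/dq.
        # Because the field is a symplectic gradient, H is exactly
        # conserved along the continuous-time flow it generates.
        qp = torch.cat([q, p], dim=-1).detach().requires_grad_(True)
        H = self.net(qp).sum()
        dH = torch.autograd.grad(H, qp, create_graph=True)[0]
        dHdq, dHdp = dH.chunk(2, dim=-1)
        return dHdp, -dHdq

def leapfrog_step(model, q, p, dt):
    # Leapfrog integration: half-step in p, full step in q, half-step
    # in p, which respects the conservation structure over long rollouts.
    _, dp = model(q, p)
    p_half = p + 0.5 * dt * dp
    dq, _ = model(q, p_half)
    q_next = q + dt * dq
    _, dp = model(q_next, p_half)
    return q_next, p_half + 0.5 * dt * dp
```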

Sources

Micro-macro kinetic flux-vector splitting schemes for the multidimensional Boltzmann-ES-BGK equation

Fast-Forward Lattice Boltzmann: Learning Kinetic Behaviour with Physics-Informed Neural Operators

Learning Hamiltonian Dynamics at Scale: A Differential-Geometric Approach

A Hamiltonian driven Geometric Construction of Neural Networks on the Lognormal Statistical Manifold

Neural non-canonical Hamiltonian dynamics for long-time simulations
