Advances in Physics-Informed Neural Networks for Scientific Computing

The field of scientific computing is shifting toward physics-informed neural networks (PINNs) for solving partial differential equations (PDEs) and related problems. Recent research focuses on architectures and training methods that incorporate physical constraints and laws directly into neural networks, improving both accuracy and generalization.

One notable direction uses geometric and physical constraints to strengthen neural PDE surrogates, yielding more accurate predictions and better generalization to new initial conditions and longer rollout durations. Another line of work develops novel training methods, such as projection-based frameworks and flow matching techniques, that offer more efficient and effective ways to train neural networks for scientific computing applications.

Noteworthy papers in this area include 'Geometric and Physical Constraints Synergistically Enhance Neural PDE Surrogates', which introduces input and output layers that respect physical laws and symmetries on staggered grids, and 'Flow Matching Meets PDEs: A Unified Framework for Physics-Constrained Generation', which proposes a generative framework that explicitly embeds physical constraints into the flow matching objective. Overall, the field is moving toward more powerful and flexible neural network architectures that capture the underlying physics of complex systems, supporting improved predictions and better decision-making across a wide range of scientific and engineering applications.
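The common thread in these works is adding a physics term to the training loss: the network is penalized not only for mismatching data but also for violating the governing PDE at sampled collocation points. The sketch below illustrates this idea in minimal form for the 1D heat equation u_t = u_xx, using NumPy with finite differences standing in for the automatic differentiation a real PINN would use; the tiny untrained MLP, the weighting, and all names are illustrative assumptions, not drawn from any of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny illustrative MLP surrogate u(x, t) -> scalar (weights untrained).
W1 = rng.normal(size=(2, 16)) * 0.5
b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1)) * 0.5
b2 = np.zeros(1)

def u(x, t):
    """Evaluate the surrogate at points (x, t)."""
    h = np.tanh(np.stack([x, t], axis=-1) @ W1 + b1)
    return (h @ W2 + b2).squeeze(-1)

def pde_residual(x, t, eps=1e-3):
    """Residual of the heat equation u_t - u_xx = 0,
    with derivatives estimated by central finite differences."""
    u_t = (u(x, t + eps) - u(x, t - eps)) / (2 * eps)
    u_xx = (u(x + eps, t) - 2 * u(x, t) + u(x - eps, t)) / eps**2
    return u_t - u_xx

# Collocation points sampled in the interior of the space-time domain.
x_c = rng.uniform(0.0, 1.0, size=128)
t_c = rng.uniform(0.0, 1.0, size=128)

# Physics loss: mean squared PDE residual at the collocation points.
physics_loss = np.mean(pde_residual(x_c, t_c) ** 2)

# Data loss: fit the initial condition u(x, 0) = sin(pi * x).
x0 = np.linspace(0.0, 1.0, 32)
data_loss = np.mean((u(x0, np.zeros_like(x0)) - np.sin(np.pi * x0)) ** 2)

# Weighted combination; the weight 1.0 is an arbitrary choice here.
total_loss = data_loss + 1.0 * physics_loss
print(f"data={data_loss:.4f} physics={physics_loss:.4f} total={total_loss:.4f}")
```

In a full PINN, `total_loss` would be minimized with gradient descent over the network weights, and the finite differences would be replaced by exact derivatives from automatic differentiation.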

Sources

Geometric and Physical Constraints Synergistically Enhance Neural PDE Surrogates

A projection-based framework for gradient-free and parallel learning

ENMA: Tokenwise Autoregression for Generative Neural PDE Operators

Mondrian: Transformer Operators via Domain Decomposition

Universal Differential Equations for Scientific Machine Learning of Node-Wise Battery Dynamics in Smart Grids

Thermodynamically Consistent Latent Dynamics Identification for Parametric Systems

Flow Matching Meets PDEs: A Unified Framework for Physics-Constrained Generation

A PDE-Based Image Dehazing Method via Atmospheric Scattering Theory

Scalable Spatiotemporal Inference with Biased Scan Attention Transformer Neural Processes

mLaSDI: Multi-stage latent space dynamics identification

Geometric flow regularization in latent spaces for smooth dynamics with the efficient variations of curvature

Generative Models for Parameter Space Reduction applied to Reduced Order Modelling

Structure and asymptotic preserving deep neural surrogates for uncertainty quantification in multiscale kinetic equations

PDESpectralRefiner: Achieving More Accurate Long Rollouts with Spectral Adjustment

Principled Approaches for Extending Neural Architectures to Function Spaces for Operator Learning
