The field of partial differential equations (PDEs) is seeing significant advances driven by neural operators. These models show great promise for improving the efficiency and accuracy of PDE solvers, particularly in complex settings such as subsurface reservoir systems and fluid dynamics. The current trend is toward neural operators that capture both the global and the local structure of a PDE, yielding more accurate and stable solutions; techniques such as spectral coupling, latent shape pretraining, and local stencils are being explored to this end. There is also growing interest in lightweight, parameter-efficient foundation models for PDEs that can be fine-tuned cheaply for a variety of downstream tasks. Notable papers in this area include:
- Neural Operators for Mathematical Modeling of Transient Fluid Flow in Subsurface Reservoir Systems, which proposes a neural operator architecture for modeling transient fluid flow in subsurface reservoir systems, achieving a six-order-of-magnitude acceleration over traditional numerical methods.
- DRIFT-Net: A Spectral-Coupled Neural Operator for PDEs Learning, which introduces a dual-branch design for capturing global and local structures of PDEs, resulting in lower error and higher throughput with fewer parameters.
- From Cheap Geometry to Expensive Physics: Elevating Neural Operators via Latent Shape Pretraining, which presents a two-stage framework for improving supervised operator learning using latent shape pretraining, consistently improving prediction accuracy across four PDE datasets.
- PDE Solvers Should Be Local: Fast, Stable Rollouts with Learned Local Stencils, which proposes a finite-difference-inspired neural architecture that enforces strict locality, achieving up to 44% lower error and up to 2x speedups over state-of-the-art operator-learning baselines.
- SPUS: A Lightweight and Parameter-Efficient Foundation Model for PDEs, which introduces a compact and efficient foundation model designed as a unified neural operator for solving a wide range of PDEs, achieving state-of-the-art generalization while requiring significantly fewer parameters and minimal fine-tuning data.
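The dual-branch idea running through these works, a global spectral path combined with a strictly local stencil path, can be sketched in a few lines. The following is a minimal NumPy illustration of one such layer on a 1D periodic grid, not any of the papers' actual architectures; the function name, the weight layout, and the fixed 3-point stencil are all hypothetical choices for exposition:

```python
import numpy as np

def spectral_local_layer(u, mode_weights, stencil):
    """Hypothetical dual-branch update: a global branch that filters
    truncated Fourier modes (FNO-style) plus a local branch applying a
    learned 3-point finite-difference-style stencil. Illustrative only."""
    n = u.shape[0]

    # Global branch: scale the lowest Fourier modes, drop the rest.
    u_hat = np.fft.rfft(u)
    k = len(mode_weights)
    u_hat[:k] *= mode_weights        # learned per-mode multipliers
    u_hat[k:] = 0.0                  # spectral truncation
    global_out = np.fft.irfft(u_hat, n)

    # Local branch: periodic 3-point stencil with learned coefficients,
    # enforcing strict locality as in stencil-based designs.
    local_out = (stencil[0] * np.roll(u, 1)
                 + stencil[1] * u
                 + stencil[2] * np.roll(u, -1))

    return global_out + local_out
```

For example, on an 8-point grid with all-ones mode weights and a zero stencil, the layer reduces to an identity map, which makes the two branches' contributions easy to inspect separately before training either one.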