The field of partial differential equations (PDEs) is undergoing a significant shift as machine learning (ML) techniques are integrated into it. Recent work has focused on improving the accuracy and generalizability of ML models for solving PDEs, particularly in settings where traditional numerical methods struggle. One key direction is the incorporation of physical constraints and properties of PDEs into ML architectures, enabling these models to better capture complex phenomena and to generalize across scales and domains. Another is the development of novel neural-network architectures and training strategies that handle the difficulties PDEs pose, such as multiscale behavior, complex geometries, and boundary conditions. These advances could substantially change how PDEs are solved in fields including fluid dynamics, physics, and engineering.

Noteworthy papers include:

- Scale-Consistent Learning for Partial Differential Equations, which proposes a data augmentation scheme based on the scale-consistency properties of PDEs (a minimal sketch of the idea appears after this list).
- Geometric Operator Learning with Optimal Transport, which integrates optimal transport into operator learning for PDEs on complex geometries.
- AI paradigm for solving differential equations, which introduces an AI solver built around a reversible scale-dilation operator that leverages the Fourier transform of multiscale solutions.
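
To make the scale-consistency idea concrete, the sketch below illustrates the general principle for the 1D heat equation u_t = nu * u_xx: rescaling space by a factor lam maps a solution with diffusivity nu to a solution of the same equation with diffusivity nu / lam^2, so (solution, coefficient) training pairs can be augmented across scales. This is a minimal illustration of the property, not the cited paper's implementation; the function names (`augment_by_rescaling`, `pde_residual`) and the choice of test solution are assumptions made here for demonstration.

```python
# Minimal sketch (not the paper's code): scale-consistency data augmentation
# for the 1D heat equation u_t = nu * u_xx. Rescaling space by a factor `lam`
# turns a solution with diffusivity nu into a solution with diffusivity
# nu / lam**2, so (field, coefficient) pairs can be augmented across scales.
import numpy as np


def heat_solution(x, t, nu, k=2 * np.pi):
    """Exact periodic heat-equation solution u(x, t) = exp(-nu k^2 t) sin(k x)."""
    return np.exp(-nu * k**2 * t) * np.sin(k * x)


def augment_by_rescaling(u_fn, nu, lam):
    """Return a spatially rescaled solution and the diffusivity it solves for.

    If u solves u_t = nu u_xx, then v(x, t) = u(lam * x, t) solves
    v_t = (nu / lam**2) v_xx, i.e. the scale-consistency property.
    """
    v_fn = lambda x, t: u_fn(lam * x, t)
    return v_fn, nu / lam**2


def pde_residual(u_fn, nu, x, t, h=1e-4):
    """Finite-difference check of u_t - nu u_xx at the points (x, t)."""
    u_t = (u_fn(x, t + h) - u_fn(x, t - h)) / (2 * h)
    u_xx = (u_fn(x + h, t) - 2 * u_fn(x, t) + u_fn(x - h, t)) / h**2
    return u_t - nu * u_xx


if __name__ == "__main__":
    nu, lam = 0.05, 2.0
    u_fn = lambda x, t: heat_solution(x, t, nu)
    v_fn, nu_aug = augment_by_rescaling(u_fn, nu, lam)

    x = np.linspace(0.0, 1.0, 17)
    t = 0.1
    # Both the original pair (u_fn, nu) and the augmented pair (v_fn, nu_aug)
    # satisfy the heat equation, so the rescaled sample is a valid new label.
    print("original residual: ", np.abs(pde_residual(u_fn, nu, x, t)).max())
    print("augmented residual:", np.abs(pde_residual(v_fn, nu_aug, x, t)).max())
```

In an operator-learning pipeline, such rescaled (field, coefficient) pairs would simply be appended to the training set, exposing the model to a wider range of effective coefficients and length scales than the raw data contains.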