Research on solving partial differential equations (PDEs) is moving toward more efficient and accurate methods, particularly for large-scale and high-dimensional problems. Recent work incorporates physical laws and constraints into machine learning models such as neural networks and Gaussian processes, improving their performance and robustness while reducing computational cost. Notably, tensor decompositions and functional tensor train representations have enabled the solution of high-dimensional PDEs on non-uniform grids and irregular domains, and physics-based regularization schemes have improved the fidelity and robustness of spatiotemporal dynamic models.

Noteworthy papers in this area include:

- LRQ-Solver: introduces a transformer-based neural operator for fast, accurate solving of large-scale 3D PDEs, achieving a 38.9% error reduction on the DrivAer++ dataset.
- Functional tensor train neural network: develops a functional tensor train neural network for solving high-dimensional PDEs on non-uniform grids or irregular domains, outperforming physics-informed neural networks (PINNs).
- Tensor Gaussian Processes: proposes a tensor-GP-based solver that models factor functions along each input dimension with one-dimensional GPs, scaling to massive collocation sets while achieving superior accuracy and efficiency compared to existing approaches.
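The low-rank separable structure shared by the functional tensor train and tensor-GP approaches can be sketched as follows. This is an illustrative toy, not the papers' actual implementations: it assumes a rank-R representation u(x1,...,xd) ≈ Σ_r Π_k f_{k,r}(x_k), with each 1D factor function drawn here from an RBF-kernel Gaussian-process prior; all names, ranks, and grid sizes are assumptions for the demo.

```python
import numpy as np

# Sketch of the separable low-rank idea behind tensor-train / tensor-GP
# PDE solvers: a d-dimensional function is built from 1D factor functions,
# so only 1D objects are ever modeled (here, samples from 1D GP priors).

rng = np.random.default_rng(0)

def rbf_kernel(x, y, lengthscale=0.3):
    """Squared-exponential kernel between two 1D point sets."""
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / lengthscale ** 2)

def sample_1d_gp(grid, rng):
    """Draw one smooth 1D factor function from a GP prior on `grid`."""
    K = rbf_kernel(grid, grid) + 1e-8 * np.eye(grid.size)  # jitter for stability
    return np.linalg.cholesky(K) @ rng.standard_normal(grid.size)

d, R, n = 3, 4, 50                       # dimensions, rank, points per axis
grids = [np.linspace(0.0, 1.0, n) for _ in range(d)]

# factors[k][r] holds f_{k,r} evaluated on grids[k] -- only O(d*R*n) values.
factors = [[sample_1d_gp(g, rng) for _ in range(R)] for g in grids]

# Assemble u on the full tensor grid from 1D factors via outer products;
# the n**d grid is never touched by any kernel solve, only by cheap products.
u = np.zeros((n,) * d)
for r in range(R):
    term = factors[0][r]
    for k in range(1, d):
        term = np.multiply.outer(term, factors[k][r])
    u += term

print(u.shape)  # (50, 50, 50)
```

The point of the sketch is the scaling: all GP computation happens on n-point 1D grids (cost O(d R n^3) at worst), while the exponentially large n^d evaluation grid is reached only through outer products, which is what lets such solvers handle massive collocation sets.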