The field is moving toward efficient tensor-based methods for solving complex, computationally demanding problems. These methods enable scalable, memory-efficient computation and make high-dimensional problems tractable that were previously infeasible. In particular, tensor-train representations and low-rank approximations are gaining traction because they offer a robust and highly efficient alternative to traditional approaches. This trend is expected to continue, with potential applications in areas such as modal analysis, model reduction, and control design.

Noteworthy papers include: Efficient High-Order Participation Factor Computation via Batch-Structured Tensor Contraction, which presents an efficient tensor-based method for calculating high-order participation factors; A Low-Rank tensor framework for THB-Splines, which introduces a low-rank framework for adaptive isogeometric analysis with truncated hierarchical B-splines; The low-rank tensor-train finite difference method for three-dimensional parabolic equations, which develops a numerical framework for the low-rank approximation of solutions to three-dimensional parabolic problems; and A Tensor Train-Based Isogeometric Solver for Large-Scale 3D Poisson Problems on Complex Geometries, which introduces a fully tensor-train-assembled isogeometric analysis framework for solving three-dimensional partial differential equations on complex geometries.
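To make the appeal of these representations concrete, the following is a minimal sketch of a tensor-train decomposition computed by sequential truncated SVDs (the TT-SVD idea underlying such low-rank methods). It is an illustrative example only; the test tensor, sizes, and tolerance are assumptions and are not drawn from any of the cited papers.

```python
import numpy as np

def tt_decompose(tensor, tol=1e-10):
    """Decompose a d-way array into TT cores G_k of shape (r_{k-1}, n_k, r_k)."""
    dims = tensor.shape
    d = len(dims)
    cores = []
    rank = 1
    # Unfold the tensor so the first mode (times the current rank) indexes rows.
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        # Keep only singular values above the relative tolerance (low TT rank).
        new_rank = max(1, int(np.sum(s > tol * s[0])))
        cores.append(U[:, :new_rank].reshape(rank, dims[k], new_rank))
        # Carry the remaining factor forward and fold in the next mode.
        mat = (s[:new_rank, None] * Vt[:new_rank]).reshape(new_rank * dims[k + 1], -1)
        rank = new_rank
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

# Example: a separable (rank-1) 3D field compresses to three tiny cores
# instead of n^3 stored entries.
n = 32
x = np.linspace(0.0, 1.0, n)
full = np.einsum('i,j,k->ijk', np.sin(x), np.cos(x), np.exp(-x))
cores = tt_decompose(full)
print([c.shape for c in cores])  # e.g. [(1, 32, 1), (1, 32, 1), (1, 32, 1)]
```

For a separable field like the one above, storage drops from n^3 values to roughly 3n, which is the kind of compression that makes the high-dimensional solvers listed here feasible; for genuinely non-separable data the TT ranks, and hence the cost, grow accordingly.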