The field of dimensionality reduction and noise reduction is moving toward more sophisticated geometric and probabilistic methods. Researchers are exploring approaches that incorporate nonlinear manifold geometry and probabilistic models to better describe complex data distributions, with the aim of improving the robustness and interpretability of techniques such as PCA and the accuracy of noise reduction in high-dimensional data. Notable advancements include probabilistic geometric principal component analysis and geometric integration for neural control variates. Noteworthy papers include a Random Matrix Theory-guided sparse PCA for single-cell RNA-seq data, which improves on traditional PCA with a mathematically grounded approach, and Probabilistic Geometric Principal Component Analysis with application to neural data, which generalizes PPCA to incorporate nonlinear manifold geometry.
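The papers' exact procedures are not reproduced here, but the random-matrix idea behind RMT-guided component selection can be illustrated with a minimal sketch: covariance eigenvalues that exceed the Marchenko-Pastur upper edge are treated as signal, while the rest are attributed to the noise bulk. The function name, preprocessing choices, and return values below are illustrative assumptions, not the method described in the paper.

```python
import numpy as np

def mp_signal_components(X):
    """Rough sketch: count PCA components whose covariance eigenvalues
    exceed the Marchenko-Pastur upper edge, a standard random-matrix
    bound on the noise bulk for unit-variance features.

    X : (n_cells, n_genes) expression matrix, assumed to be already
        normalized and log-transformed (an assumption, not the paper's
        exact preprocessing).
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)                     # center each gene
    Xc = Xc / (Xc.std(axis=0, ddof=1) + 1e-12)  # scale genes to unit variance
    # Covariance eigenvalues obtained via SVD of the centered/scaled data
    s = np.linalg.svd(Xc, compute_uv=False)
    eigvals = s ** 2 / (n - 1)
    gamma = p / n
    mp_upper = (1.0 + np.sqrt(gamma)) ** 2      # MP upper edge for unit-variance noise
    k = int(np.sum(eigvals > mp_upper))         # eigenvalues above the noise bulk
    return k, eigvals, mp_upper
```

The returned count k gives a noise-aware choice for how many principal components to retain, replacing ad hoc cutoffs such as a fixed number of components or a visual scree-plot inspection.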