Tensor-Based Methods for Efficient Computing with Uncertainty

The field of uncertainty quantification and stochastic modeling is shifting toward tensor-based methods, which offer a promising route around the curse of dimensionality. These methods enable efficient computation with high-dimensional probability distributions, supporting exact arithmetic operations and fully deterministic computations. Tensor train approaches, mode-aware non-linear Tucker autoencoders, and tensor-based dynamic mode decomposition are notable examples of techniques advancing the field.

Noteworthy papers include:

A Tensor Train Approach for Deterministic Arithmetic Operations on Discrete Representations of Probability Distributions, which presents an efficient tensor train method for performing exact arithmetic operations on discretizations of continuous probability distributions.

Mode-Aware Non-Linear Tucker Autoencoder for Tensor-based Unsupervised Learning, which introduces a non-linear framework for tensor decomposition whose computational complexity grows linearly with tensor order.
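To make the core idea concrete, the sketch below shows the standard TT-SVD algorithm: a d-way array (e.g. a discretized joint probability distribution) is compressed into a chain of small 3-way cores via successive truncated SVDs, so storage grows linearly rather than exponentially in the number of dimensions. This is a generic illustration of tensor train decomposition in numpy, not the specific method of any paper listed here; the function names and the `max_rank` truncation parameter are illustrative.

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into tensor-train (TT) cores via successive SVDs.

    Each core has shape (r_prev, n_k, r_next); storage is O(d * n * r^2)
    instead of O(n^d) for the full tensor.
    """
    dims = tensor.shape
    d = len(dims)
    cores = []
    rank = 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))           # truncate to the target TT-rank
        cores.append(u[:, :r].reshape(rank, dims[k], r))
        mat = (np.diag(s[:r]) @ vt[:r]).reshape(r * dims[k + 1], -1)
        rank = r
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into the full tensor (for verification)."""
    result = cores[0]
    for core in cores[1:]:
        result = np.tensordot(result, core, axes=([-1], [0]))
    return result.reshape([c.shape[1] for c in cores])

# Example: a rank-1 3-way tensor compresses exactly with tiny TT-ranks.
a, b, c = np.array([1.0, 2.0]), np.array([3.0, 4.0, 5.0]), np.array([1.0, -1.0])
t = np.einsum('i,j,k->ijk', a, b, c)
cores = tt_svd(t, max_rank=2)
```

Once a distribution lives in this format, operations such as pointwise arithmetic or marginalization can act on the small cores directly, which is what makes deterministic computation with high-dimensional distributions tractable.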

Sources

A Tensor Train Approach for Deterministic Arithmetic Operations on Discrete Representations of Probability Distributions

Mode-Aware Non-Linear Tucker Autoencoder for Tensor-based Unsupervised Learning

The Monte Carlo Method and New Device and Architectural Techniques for Accelerating It

A Lightweight Learned Cardinality Estimation Model

TensorKit.jl: A Julia package for large-scale tensor computations, with a hint of category theory

A tensor-based dynamic mode decomposition based on the $\star_{\boldsymbol{M}}$-product
