Advances in Automatic Differentiation and Sparse Matrix Computations

The field of automatic differentiation (AD) and sparse matrix computations is seeing significant development, driven by the need for efficient and modular solutions. Recent work has focused on common interfaces for AD systems, which let users switch and compare backends without rewriting code and encourage modular development. In parallel, new methods for computing Jacobian matrices via sparse matrix coloring and bicoloring reduce the number of matrix-vector products needed to recover all nonzero entries; a sketch of the underlying coloring idea appears below. New strategies for matrix inversion, such as dyadic factorization and sparse Gram-Schmidt orthogonalization, have also shown promise. These advances stand to benefit applications ranging from scientific machine learning to large-scale sparse matrix computations. Noteworthy papers include A Common Interface for Automatic Differentiation, which provides a common frontend to multiple AD backends, and Dyadic Factorization and Efficient Inversion of Sparse Positive Definite Matrices, which proposes an efficient approach to inverting sparse positive definite matrices via dyadic factorization.
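
To illustrate the common-interface idea, the minimal Python sketch below defines a single `gradient` entry point that dispatches to interchangeable differentiation backends. This is a hypothetical design, not the paper's API: the names (`GradientBackend`, `FiniteDifferences`, `ComplexStep`) are illustrative stand-ins, and the two numerical-derivative backends merely play the role that real AD backends would behind a common frontend.

```python
from typing import Callable, Protocol

import numpy as np


class GradientBackend(Protocol):
    """Anything that can compute a gradient behind the common interface."""
    def gradient(self, f: Callable[[np.ndarray], float],
                 x: np.ndarray) -> np.ndarray: ...


class FiniteDifferences:
    """Central finite differences: O(h^2) accuracy, 2n evaluations of f."""
    def __init__(self, h: float = 1e-6) -> None:
        self.h = h

    def gradient(self, f, x):
        g = np.zeros(x.size)
        for i in range(x.size):
            e = np.zeros(x.size)
            e[i] = self.h
            g[i] = (f(x + e) - f(x - e)) / (2.0 * self.h)
        return g


class ComplexStep:
    """Complex-step differentiation: near machine precision for analytic f."""
    def __init__(self, h: float = 1e-20) -> None:
        self.h = h

    def gradient(self, f, x):
        g = np.zeros(x.size)
        for i in range(x.size):
            e = np.zeros(x.size, dtype=complex)
            e[i] = 1j * self.h
            g[i] = f(x.astype(complex) + e).imag / self.h
        return g


def gradient(f: Callable, backend: GradientBackend,
             x: np.ndarray) -> np.ndarray:
    """Common frontend: swapping the backend changes the method, not the call."""
    return backend.gradient(f, x)


if __name__ == "__main__":
    f = lambda x: np.sum(np.sin(x) * x)
    x0 = np.array([0.5, 1.0, 2.0])
    for backend in (FiniteDifferences(), ComplexStep()):
        print(type(backend).__name__, gradient(f, backend, x0))
```

The design choice is the point: user code calls one `gradient` function, so benchmarking or swapping backends is a one-argument change rather than a rewrite.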

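To make the coloring idea concrete, here is a hedged sketch of the classic column-coloring approach (in the spirit of Curtis-Powell-Reid) that the coloring and bicoloring literature builds on: structurally orthogonal columns (columns sharing no nonzero row) receive the same color, so one directional derivative per color suffices to recover the full sparse Jacobian. The code and helper names are illustrative, not taken from the cited papers.

```python
import numpy as np


def color_columns(sparsity: np.ndarray) -> np.ndarray:
    """Greedily color columns so that no two columns sharing a
    nonzero row receive the same color."""
    n = sparsity.shape[1]
    colors = np.full(n, -1, dtype=int)
    for j in range(n):
        forbidden = {colors[k] for k in range(j)
                     if np.any(sparsity[:, j] & sparsity[:, k])}
        c = 0
        while c in forbidden:
            c += 1
        colors[j] = c
    return colors


def sparse_jacobian(f, x, sparsity, h=1e-6):
    """Recover a sparse Jacobian with one forward difference per
    color instead of one per column."""
    colors = color_columns(sparsity)
    fx = f(x)
    J = np.zeros(sparsity.shape)
    for c in range(colors.max() + 1):
        d = (colors == c).astype(float)   # sum of same-colored columns
        col = (f(x + h * d) - fx) / h     # compressed product J @ d
        for j in np.flatnonzero(colors == c):
            rows = np.flatnonzero(sparsity[:, j])
            J[rows, j] = col[rows]        # unambiguous by construction
    return J, colors.max() + 1


if __name__ == "__main__":
    def f(x):  # f_i = x_i^2 + x_{i-1} + x_{i+1}: tridiagonal Jacobian
        y = x ** 2
        y[1:] += x[:-1]
        y[:-1] += x[1:]
        return y

    n = 8
    S = (np.eye(n, dtype=bool) | np.eye(n, k=1, dtype=bool)
         | np.eye(n, k=-1, dtype=bool))
    J, ncolors = sparse_jacobian(f, np.linspace(0.1, 1.0, n), S)
    print(f"{ncolors} colored evaluations instead of {n}")  # 3, not 8
```

For a tridiagonal pattern, three colors suffice regardless of dimension, so the cost of the Jacobian stays constant as n grows; bicoloring extends this idea by compressing rows and columns simultaneously.
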
Sources

A Common Interface for Automatic Differentiation

Scheduled Jacobian Chaining

Revisiting Sparse Matrix Coloring and Bicoloring

Dyadic Factorization and Efficient Inversion of Sparse Positive Definite Matrices
