Symmetry in Optimization and Neural Networks

Research in optimization and neural networks is shifting toward understanding and exploiting symmetry, whether in optimization landscapes, in network architectures, or in explicitly equivariant models. A key finding is that symmetry is ubiquitous among the critical points of diverse optimization landscapes, and this phenomenon can be leveraged to make neural networks more efficient. Exchangeability, in particular, is being explored as a principled way to identify and reduce redundancy in neural networks, leading to dynamic pruning algorithms that substantially reduce computational cost without compromising accuracy. In parallel, analyses of the loss landscape geometry of equivariant models reveal that parameter symmetries can have non-trivial effects on optimization. Noteworthy papers in this area include:

  • Ubiquitous Symmetry at Critical Points Across Diverse Optimization Landscapes, which introduces a new measure of symmetry and reveals additional symmetry structures not captured by previous measures.
  • Exchangeability in Neural Network Architectures and its Application to Dynamic Pruning, which derives a principled dynamic pruning algorithm that exploits symmetry-induced redundancy (a minimal sketch of this idea follows the list).
  • Learning equivariant models by discovering symmetries with learnable augmentations, which proposes an end-to-end approach to jointly discover and encode symmetries in data (see the second sketch below).
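
To make the exchangeability idea concrete, here is a minimal NumPy sketch of symmetry-aware pruning for a two-layer MLP x → act(W1 x + b1) → W2 h + b2. Hidden units whose incoming weights coincide are exchangeable: permuting them leaves the network function unchanged, so their outgoing weights can be summed. The function name, tolerance, and static one-shot setting are illustrative assumptions; the paper's dynamic pruning algorithm is more general.

```python
import numpy as np

def merge_exchangeable_units(W1, b1, W2, tol=1e-3):
    """Merge hidden units whose incoming weights (rows of W1 and entries of
    b1) coincide up to `tol`. Such units compute identical activations, so
    their outgoing weights (columns of W2) can be summed without changing
    the network function."""
    keep = []                              # representative unit of each group
    assigned = np.full(W1.shape[0], -1)    # group index of each unit
    for i in range(W1.shape[0]):
        for k, j in enumerate(keep):
            if np.allclose(W1[i], W1[j], atol=tol) and np.isclose(b1[i], b1[j], atol=tol):
                assigned[i] = k
                break
        if assigned[i] == -1:
            assigned[i] = len(keep)
            keep.append(i)
    W2_new = np.zeros((W2.shape[0], len(keep)))
    for i in range(W1.shape[0]):
        W2_new[:, assigned[i]] += W2[:, i]  # v1*h + v2*h = (v1 + v2)*h
    return W1[keep], b1[keep], W2_new

# Example: a 4-unit hidden layer in which units 0 and 2 are duplicates.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 3)); W1[2] = W1[0]
b1 = rng.normal(size=4);      b1[2] = b1[0]
W2 = rng.normal(size=(2, 4))
W1s, b1s, W2s = merge_exchangeable_units(W1, b1, W2)
x = rng.normal(size=3)
assert np.allclose(W2 @ np.tanh(W1 @ x + b1), W2s @ np.tanh(W1s @ x + b1s))
```

The assertion checks that the pruned network computes exactly the same function with one fewer hidden unit, which is the sense in which symmetry-induced redundancy is "free" to remove.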

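The learnable-augmentation idea can be sketched just as briefly. The toy example below is written in the spirit of Augerino-style methods, not as a reconstruction of the paper's actual procedure; every name and hyperparameter here is an assumption. A regression model is trained jointly with the width of a random-rotation augmentation; because the target is rotation-invariant, widening the augmentation never hurts the task loss, and a small bonus for wide augmentations drives the learned width toward the full rotation group.

```python
import math
import torch
import torch.nn as nn

def rotate(x, theta):
    """Rotate a batch of 2D points x (batch, 2) by per-sample angles theta."""
    c, s = torch.cos(theta), torch.sin(theta)
    R = torch.stack([torch.stack([c, -s], dim=-1),
                     torch.stack([s, c], dim=-1)], dim=-2)  # (batch, 2, 2)
    return (R @ x.unsqueeze(-1)).squeeze(-1)

model = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
raw_width = nn.Parameter(torch.tensor(0.0))  # learnable augmentation range
opt = torch.optim.Adam(list(model.parameters()) + [raw_width], lr=1e-2)

for step in range(2000):
    x = torch.randn(128, 2)
    y = x.norm(dim=1, keepdim=True)             # rotation-invariant target
    width = math.pi * torch.sigmoid(raw_width)  # range constrained to (0, pi)
    theta = width * (2 * torch.rand(128) - 1)   # theta ~ Uniform(-width, width)
    pred = model(rotate(x, theta))
    # Task loss plus a small bonus for wide augmentations: widening is only
    # "free" if the task really is invariant to the sampled transformations.
    loss = ((pred - y) ** 2).mean() - 0.05 * width
    opt.zero_grad()
    loss.backward()
    opt.step()

# `width` climbs toward pi: the augmentation has discovered that the task is
# invariant under the full rotation group SO(2), which an invariant model
# can then encode explicitly.
```
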
Sources

  • Ubiquitous Symmetry at Critical Points Across Diverse Optimization Landscapes
  • Exchangeability in Neural Network Architectures and its Application to Dynamic Pruning
  • A Tale of Two Symmetries: Exploring the Loss Landscape of Equivariant Models
  • On Universality Classes of Equivariant Networks
  • Breaking Symmetries with Involutions
  • Learning equivariant models by discovering symmetries with learnable augmentations
