Neural Tangent Kernel Advances

Research on the Neural Tangent Kernel (NTK) is advancing on several fronts, with recent work focused on making NTK-based methods more efficient and effective, enabling faster analysis and broader application. One key direction is matrix-free computation, which avoids forming the kernel matrix explicitly and can yield speedups of several orders of magnitude. Another is the use of NTK-guided methods to accelerate training and improve representation quality; both ideas are sketched informally below. Noteworthy papers include NTK-Guided Implicit Neural Teaching, which accelerates training by dynamically selecting the coordinates that maximize the global functional update, and Convergence and Sketching-Based Efficient Computation of Neural Tangent Kernel Weights in Physics-Based Loss, which proves the convergence of an adaptive weighting algorithm and develops a randomized algorithm for efficiently computing NTK-based weights.
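
To give a rough sense of the matrix-free idea (this is a generic Hutchinson-style trace-estimation sketch, not the exact algorithm from any of the papers above), the snippet below estimates the NTK trace and a trace-based effective-rank proxy using only kernel-vector products. The `matvec` argument stands in for a Jacobian-vector/vector-Jacobian product pipeline; the explicit demo matrix `K` is a placeholder for an NTK Gram matrix that would never be materialized in practice.

```python
import numpy as np

def hutchinson_trace_stats(matvec, dim, num_probes=100, seed=None):
    """Estimate tr(K) and tr(K^2) of a symmetric operator K given only
    a matrix-vector product, via Hutchinson probing with Rademacher vectors."""
    rng = np.random.default_rng(seed)
    tr_k, tr_k2 = 0.0, 0.0
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=dim)  # Rademacher probe vector
        kz = matvec(z)
        tr_k += z @ kz    # E[z^T K z]   = tr(K)
        tr_k2 += kz @ kz  # E[||K z||^2] = tr(K^2) for symmetric K
    tr_k /= num_probes
    tr_k2 /= num_probes
    eff_rank = tr_k**2 / tr_k2  # participation-ratio proxy for effective rank
    return tr_k, tr_k2, eff_rank

# Demo on a small explicit PSD matrix standing in for an NTK Gram matrix.
A = np.random.default_rng(0).normal(size=(200, 200))
K = A @ A.T
tr_k, _, eff_rank = hutchinson_trace_stats(lambda v: K @ v, dim=200,
                                           num_probes=500, seed=1)
print(f"estimated tr(K) = {tr_k:.1f}, true tr(K) = {np.trace(K):.1f}")
print(f"estimated effective rank of K = {eff_rank:.1f}")
```

The point of the construction is that each probe costs one kernel-vector product, so the kernel matrix itself, which is quadratic in the number of samples, never needs to be stored.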

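For the NTK-guided direction, here is a loose illustration of selecting high-impact coordinates. The row-norm scoring rule is an assumption for demonstration purposes, not the criterion used in NTK-Guided Implicit Neural Teaching; the intuition is simply that the empirical NTK row of a coordinate measures how strongly a gradient step there moves the network function elsewhere.

```python
import numpy as np

def ntk_guided_selection(jacobian, k):
    """Score each candidate coordinate by the row norm of the empirical NTK
    (assumed scoring rule, for illustration only) and keep the top-k."""
    ntk = jacobian @ jacobian.T           # K[i, j] = <grad f(x_i), grad f(x_j)>
    scores = np.linalg.norm(ntk, axis=1)  # aggregate functional impact of coordinate i
    return np.argsort(scores)[-k:]        # indices of the k highest-impact coordinates

# Toy demo: 500 candidate coordinates, a 50-parameter network.
rng = np.random.default_rng(0)
J = rng.normal(size=(500, 50))            # stand-in for per-coordinate gradients
selected = ntk_guided_selection(J, k=32)  # coordinates to train on next
print(selected)
```
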
Sources

Fast Neural Tangent Kernel Alignment, Norm and Effective Rank via Trace Estimation

Derivative of the Truncated Singular Value and Eigen Decomposition

NTK-Guided Implicit Neural Teaching

Convergence and Sketching-Based Efficient Computation of Neural Tangent Kernel Weights in Physics-Based Loss
