Distributed Computation and Local Learning in Neural Networks

The field of neural networks is moving towards more biologically plausible and distributed computing models. Researchers are exploring algorithms that rely on local learning rules and local computation, reducing the dependence on global state and end-to-end backpropagation. The shift is motivated by the demand for models that are more efficient and scalable when applied to real-world problems. Noteworthy papers in this area include:

  • Predictive Spike Timing Enables Distributed Shortest Path Computation in Spiking Neural Networks, which proposes a biologically plausible algorithm for shortest-path computation.
  • Adaptive Spatial Goodness Encoding: Advancing and Scaling Forward-Forward Learning Without Backpropagation, which introduces a new training framework that decouples classification complexity from channel dimensionality (a generic Forward-Forward sketch follows this list).
  • Traces Propagation: Memory-Efficient and Scalable Forward-Only Learning in Spiking Neural Networks, which presents a forward-only learning rule that combines eligibility traces with a layer-wise contrastive loss (a generic trace-based update is also sketched below).
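
The Forward-Forward family of methods referenced above trains each layer from a purely local "goodness" objective instead of a backpropagated error. As a rough illustration of that idea, the PyTorch sketch below shows plain layer-wise Forward-Forward training with a sum-of-squares goodness; the layer sizes, threshold, and optimizer settings are illustrative assumptions and do not reproduce the Adaptive Spatial Goodness Encoding of the paper above.

```python
import torch
import torch.nn as nn


class FFLayer(nn.Module):
    """One layer trained only from its own local goodness objective."""

    def __init__(self, in_dim, out_dim, threshold=2.0, lr=1e-3):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.threshold = threshold
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalise the input so only its direction (not the previous
        # layer's goodness) is passed upward.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return torch.relu(self.linear(x))

    def train_local(self, x_pos, x_neg):
        # Goodness = sum of squared activations; push it above the threshold
        # for positive samples and below it for negative samples.
        g_pos = self.forward(x_pos).pow(2).sum(dim=1)
        g_neg = self.forward(x_neg).pow(2).sum(dim=1)
        loss = nn.functional.softplus(
            torch.cat([self.threshold - g_pos, g_neg - self.threshold])
        ).mean()
        self.opt.zero_grad()
        loss.backward()  # gradients never leave this layer
        self.opt.step()
        # Detach so no error signal crosses layer boundaries.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()


# Greedy layer-by-layer training: each layer sees only local information.
layers = [FFLayer(784, 256), FFLayer(256, 256)]
x_pos = torch.rand(32, 784)  # e.g. images paired with the correct label
x_neg = torch.rand(32, 784)  # e.g. images paired with a wrong label
for layer in layers:
    x_pos, x_neg = layer.train_local(x_pos, x_neg)
```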

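Similarly, the Traces Propagation entry combines per-synapse eligibility traces with a layer-wise contrastive loss. The paper's exact rule is not reproduced here; the NumPy sketch below only illustrates the general three-factor pattern such rules build on: a decaying trace of local pre- and post-synaptic activity, modulated by a local error signal. The spike model, surrogate derivative, and random targets are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 100, 20
w = rng.normal(0.0, 0.1, size=(n_in, n_out))
trace = np.zeros_like(w)   # one eligibility trace per synapse
decay, lr, threshold = 0.9, 1e-2, 1.0

for t in range(50):
    x = (rng.random(n_in) < 0.1).astype(float)        # input spike vector
    v = x @ w                                          # membrane potentials
    spikes = (v > threshold).astype(float)             # output spikes
    surrogate = np.maximum(0.0, 1.0 - np.abs(v - threshold))  # surrogate derivative
    # Eligibility trace: decaying product of pre-synaptic activity and the
    # post-synaptic surrogate derivative, maintained purely locally.
    trace = decay * trace + np.outer(x, surrogate)
    # Layer-local error signal (random targets here, purely for illustration).
    target = (rng.random(n_out) < 0.1).astype(float)
    local_error = target - spikes
    # Three-factor update: the local error modulates each synapse's trace.
    w += lr * trace * local_error[None, :]
```
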
Sources

Predictive Spike Timing Enables Distributed Shortest Path Computation in Spiking Neural Networks

Adaptive Spatial Goodness Encoding: Advancing and Scaling Forward-Forward Learning Without Backpropagation

A Neuromorphic Model of Learning Meaningful Sequences with Long-Term Memory

Traces Propagation: Memory-Efficient and Scalable Forward-Only Learning in Spiking Neural Networks
