Donald Hebb proposed in his 1949 book The Organization of Behavior that synaptic connections between neurons are strengthened when both neurons are simultaneously active. This simple principle — often paraphrased as "cells that fire together wire together" — has become the foundation for understanding both biological synaptic plasticity and artificial learning rules.
In its simplest mathematical form, the Hebbian weight update is

Δwᵢⱼ = η xᵢ xⱼ

where
wᵢⱼ = synaptic weight from neuron i to neuron j
xᵢ, xⱼ = activations of the pre- and post-synaptic neurons
η = learning rate
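The update above can be sketched in a few lines of NumPy. This is a minimal illustration, not a reference implementation; the function name and the choice of representing the weights as a matrix indexed (pre, post) are assumptions made here for clarity.

```python
import numpy as np

def hebbian_update(w, x_pre, x_post, eta=0.01):
    """One Hebbian step: Δw_ij = η · x_i · x_j.

    w      : (n_pre, n_post) weight matrix, w[i, j] from neuron i to j
    x_pre  : (n_pre,) pre-synaptic activations x_i
    x_post : (n_post,) post-synaptic activations x_j
    eta    : learning rate η
    """
    # Outer product gives every pairwise co-activation x_i * x_j.
    return w + eta * np.outer(x_pre, x_post)

# Example: a weight grows only when both endpoints are active together.
w = np.zeros((3, 2))
x_pre = np.array([1.0, 0.0, 1.0])
x_post = np.array([0.5, 1.0])
w = hebbian_update(w, x_pre, x_post, eta=0.1)
```

Note that the update is purely local (each weight depends only on its own two endpoints) and strictly non-negative for non-negative activations, which is exactly the unbounded-growth problem discussed below.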
Biological Basis
Hebbian learning received dramatic biological support with the discovery of long-term potentiation (LTP) by Bliss and Lømo in 1973. LTP shows that brief high-frequency stimulation of a synapse produces a lasting increase in synaptic strength, precisely the co-activation-dependent strengthening that Hebb predicted. The NMDA receptor has since been identified as a molecular coincidence detector that implements Hebbian synaptic modification: it opens only when pre-synaptic glutamate release coincides with post-synaptic depolarization.
Mathematical Extensions
The basic Hebbian rule has a fundamental problem: weights grow without bound. This has led to numerous extensions including Oja's rule (which normalizes the weight vector), BCM theory (which introduces a sliding threshold between potentiation and depression), and covariance learning rules. These extensions connect Hebbian learning to principal component analysis and competitive learning, providing mathematical rigor while preserving the core biological insight.
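Oja's rule, mentioned above, fixes the unbounded growth by adding a weight-decay term proportional to the squared output: Δw = η y (x − y w), with y = wᵀx. Under this rule the weight vector's norm converges toward 1 and w aligns with the first principal component of the input data, which is the PCA connection noted above. The sketch below assumes a single linear neuron; the function name and data-generation details are illustrative choices, not part of the original formulation.

```python
import numpy as np

def oja_step(w, x, eta=0.01):
    """One step of Oja's rule: Δw = η · y · (x − y·w), y = wᵀx.

    The −η·y²·w decay term keeps ‖w‖ bounded, unlike plain Hebbian learning.
    """
    y = w @ x
    return w + eta * y * (x - y * w)

# Train on inputs whose variance is largest along the direction [1, 1]/√2.
rng = np.random.default_rng(0)
w = rng.normal(size=2) * 0.1
direction = np.array([1.0, 1.0]) / np.sqrt(2.0)
for _ in range(5000):
    # Strong component along `direction`, weak isotropic noise.
    x = rng.normal() * direction + 0.1 * rng.normal(size=2)
    w = oja_step(w, x, eta=0.01)
```

After training, ‖w‖ is close to 1 and w points (up to sign) along the dominant variance direction, so the rule performs an online form of principal component extraction while remaining local and Hebbian at its core.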