John Hopfield's 1982 paper introduced a recurrent neural network model that stores memories as stable attractor states. The Hopfield network demonstrated that simple neural dynamics could implement content-addressable (associative) memory, where presenting a partial or noisy pattern retrieves the complete stored pattern.
Weight learning (Hebbian rule, where μ indexes the stored patterns ξᵘ): wᵢⱼ = (1/N) Σᵤ ξᵢᵘ ξⱼᵘ, with no self-connections (wᵢᵢ = 0)
Update rule (asynchronous): sᵢ → sign(Σⱼ wᵢⱼ sⱼ)
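The two rules above can be sketched in a few lines of NumPy: Hebbian storage of ±1 patterns, followed by asynchronous sign updates for recall. The function names (`store`, `recall`) and the experiment parameters are illustrative, not from the paper.

```python
import numpy as np

def store(patterns):
    """Hebbian weights w_ij = (1/N) sum_mu xi_i^mu xi_j^mu, zero diagonal."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)        # common convention: no self-connections
    return W

def recall(W, s, rng, sweeps=10):
    """Asynchronous updates s_i <- sign(sum_j w_ij s_j), in random order."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = W[i] @ s
            if h != 0:               # leave s_i unchanged on a zero field
                s[i] = 1 if h > 0 else -1
    return s

rng = np.random.default_rng(0)
N = 100
patterns = rng.choice([-1, 1], size=(3, N))  # 3 patterns, well below capacity
W = store(patterns)

cue = patterns[0].copy()
cue[rng.choice(N, size=10, replace=False)] *= -1  # corrupt 10% of the bits
recalled = recall(W, cue, rng)
print((recalled == patterns[0]).mean())  # overlap with the stored pattern
```

At this low load (3 patterns in 100 neurons), the corrupted cue is pulled back to the complete stored pattern, which is the content-addressable behavior described above.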
Memory as Energy Minimization
Each stored pattern corresponds to a local minimum of the energy function E(s) = −(1/2) Σᵢⱼ wᵢⱼ sᵢ sⱼ. Starting from any initial state, asynchronous updates of individual neurons always decrease the energy (or leave it unchanged), guaranteeing convergence to a local minimum. These attractor dynamics provide a natural model of memory completion: a noisy or partial cue flows "downhill" to the nearest stored pattern.
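The monotone-descent property is easy to check numerically: with the standard energy E(s) = −½ sᵀWs and a zero-diagonal weight matrix, the energy recorded after each asynchronous sweep never increases. This is a toy verification sketch; all variable names are illustrative.

```python
import numpy as np

def energy(W, s):
    """Hopfield energy E(s) = -1/2 * s^T W s."""
    return -0.5 * (s @ W @ s)

rng = np.random.default_rng(1)
N = 50
patterns = rng.choice([-1, 1], size=(2, N))
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)             # zero diagonal is needed for the proof

s = rng.choice([-1, 1], size=N)      # random initial state
energies = [energy(W, s)]
for _ in range(5):                    # a few full asynchronous sweeps
    for i in rng.permutation(N):
        h = W[i] @ s
        if h != 0:
            s[i] = 1 if h > 0 else -1
    energies.append(energy(W, s))

# Energy is non-increasing from sweep to sweep
print(all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:])))  # True
```

Note that this guarantee holds for asynchronous (one-neuron-at-a-time) updates; fully synchronous updates can instead fall into two-state cycles.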
Capacity and Limitations
The maximum number of patterns that can be reliably stored is approximately 0.14N, where N is the number of neurons. Beyond this capacity, spurious minima (blends of stored patterns) proliferate. Modern extensions, including continuous Hopfield networks and dense associative memories with higher-order interactions, dramatically increase storage capacity and have recently been connected to the attention mechanisms in transformer architectures.
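The ~0.14N capacity limit can be seen empirically in a small experiment: recall from a lightly corrupted cue is near-perfect at low load but degrades once the number of stored patterns pushes well past capacity. The helper name and parameters below are illustrative; this is a toy demonstration, not a proof.

```python
import numpy as np

def recall_overlap(N, P, rng, flips=5, sweeps=5):
    """Store P random patterns in N neurons, cue with a corrupted copy of
    pattern 0, and return the fraction of bits recovered correctly."""
    patterns = rng.choice([-1, 1], size=(P, N))
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)
    s = patterns[0].copy()
    s[rng.choice(N, size=flips, replace=False)] *= -1
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = W[i] @ s
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return (s == patterns[0]).mean()

rng = np.random.default_rng(2)
N = 200
low = recall_overlap(N, 5, rng)    # load 0.025, far below 0.14N
high = recall_overlap(N, 60, rng)  # load 0.30, well above 0.14N
print(low, high)
```

Below capacity the overlap is essentially 1.0; above it, crosstalk between patterns destabilizes the stored states and exact recall fails.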