Foundations Of Cognitive Science

Hopfield Network

A Hopfield network, created by the physicist John Hopfield (1982), is an autoassociative network consisting of a single set of processing units that are all connected to one another.  Usually the processing units have a nonlinear activation function, such as the step function, although the activation function can also be continuous (Hopfield, 1984).  The network can be trained on a set of patterns using the Hebb rule.  Each trained pattern inserts a local minimum into an “energy space” for the network.  Later, when a distorted pattern is presented to the network, the network changes the values of its units so as to descend into one of these local minima, and then stops changing; the resulting stable state represents a recalled pattern.  Hopfield networks are interesting because they take time to settle on a response, and therefore can be used to model psychological tasks in which response latency is a key dependent measure.  They also show how a network can complete (or clean up) distorted patterns, demonstrating one of the putative connectionist advantages, graceful degradation.
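The training-and-recall cycle described above can be sketched in a few lines of Python with NumPy. This is a minimal illustration, not Hopfield's original formulation: the bipolar (+1/−1) coding, the asynchronous update schedule, and the function names are assumptions made for the sketch.

```python
import numpy as np

def train_hebb(patterns):
    """Build a weight matrix with the Hebb rule from bipolar (+1/-1) patterns."""
    n = patterns.shape[1]
    w = np.zeros((n, n))
    for p in patterns:
        w += np.outer(p, p)        # Hebbian learning: strengthen co-active pairs
    np.fill_diagonal(w, 0)         # units are not connected to themselves
    return w / patterns.shape[0]

def energy(w, state):
    """Hopfield's energy function; each update below never increases it."""
    return -0.5 * state @ w @ state

def recall(w, state, max_steps=100):
    """Update units asynchronously with a step activation until stable."""
    state = state.copy()
    for _ in range(max_steps):
        changed = False
        for i in np.random.permutation(len(state)):
            new = 1 if w[i] @ state >= 0 else -1   # step activation function
            if new != state[i]:
                state[i] = new
                changed = True
        if not changed:            # settled into a local energy minimum
            break
    return state
```

Presenting a distorted copy of a trained pattern to `recall` moves the network downhill in energy until it reaches the nearest stored minimum; the number of update sweeps taken before settling is what makes the model usable for latency predictions.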


  1. Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 79, 2554-2558.
  2. Hopfield, J. J. (1984). Neurons with graded response have collective computational properties like those of two-state neurons. Proceedings of the National Academy of Sciences, 81, 3088-3092.

(Added October 2009)