The generic backpropagation of error algorithm can be realized in a variety of ways. For instance, in __stochastic training__, connection weights are updated after each pattern is presented (Dawson, 2004). The approach is called stochastic because, although each pattern is presented once per epoch of training, the order of presentation is randomized anew each epoch. This slows a training program down, because the program must run a randomization algorithm every epoch. However, stochastic training is psychologically plausible in the sense that learning occurs after each pattern is encountered, and learning can be affected by the order of pattern presentation.
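The two defining features of stochastic training, per-pattern weight updates and a freshly shuffled presentation order each epoch, can be sketched in a few lines. The sketch below is a minimal illustration, not Dawson's implementation: it trains a single sigmoid unit on a hypothetical toy problem (logical OR) with a simple delta-rule update, standing in for the full backpropagation of error algorithm.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical toy training set: learn logical OR with one sigmoid unit.
patterns = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]

weights = [0.0, 0.0]
bias = 0.0
rate = 0.5

for epoch in range(1000):
    # Each pattern appears once per epoch, but in a freshly randomized order.
    random.shuffle(patterns)
    for inputs, target in patterns:
        net = sum(w * i for w, i in zip(weights, inputs)) + bias
        out = sigmoid(net)
        # Delta rule: error scaled by the derivative of the sigmoid.
        delta = (target - out) * out * (1.0 - out)
        # Weights are updated immediately, after EACH pattern presentation.
        weights = [w + rate * delta * i for w, i in zip(weights, inputs)]
        bias += rate * delta

def respond(inputs):
    return sigmoid(sum(w * i for w, i in zip(weights, inputs)) + bias)

for inputs, target in sorted(patterns):
    print(inputs, round(respond(inputs)))
```

The contrast with batch training is the placement of the update: here the weight change is applied inside the per-pattern loop, whereas a batch learner would accumulate the deltas and apply them once per epoch.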

**References:**

- Dawson, M. R. W. (2004). *Minds and Machines: Connectionism and Psychological Modeling*. Malden, MA: Blackwell Publishing.

(Added April 2011)