Foundations Of Cognitive Science

Batch Training

Batch training is one approach to training a network with the generalized delta rule.  In batch training, error is accumulated over an entire epoch (where an epoch is a single presentation of every training pattern); weights are updated only once, at the end of the epoch, using the accumulated error (Rumelhart et al., 1986).  This can speed up learning (because patterns need not be presented in random order each sweep), but it is not psychologically plausible, because batch training requires knowing when an epoch has finished (and therefore when to modify the weights).
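The procedure can be sketched in a few lines of code.  The example below is purely illustrative (the dataset, learning rate, and variable names are not from the article): a single linear unit is trained with the delta rule, with the error signal accumulated across all patterns and the weight changed only once per epoch.

```python
# Illustrative sketch of batch training with the delta rule.
# A single linear unit learns the mapping y = 2 * x; note that the
# weight is updated once per epoch, after error has been accumulated
# over every training pattern.

patterns = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # (input, target) pairs
w = 0.0    # the unit's single connection weight
lr = 0.02  # learning rate

for epoch in range(200):
    grad = 0.0                   # error accumulated over the epoch
    for x, t in patterns:
        y = w * x                # unit's response to this pattern
        grad += (t - y) * x      # delta-rule contribution for this pattern
    w += lr * grad               # one weight update, at the end of the epoch

print(round(w, 3))
```

Moving the `w += lr * grad` line inside the inner loop (and resetting `grad` each pattern) would turn this into pattern-by-pattern (online) training, where weights change after every presentation.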


  1. Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning internal representations by error propagation. In D. E. Rumelhart & J. L. McClelland (Eds.), Parallel Distributed Processing (Vol. 1, pp. 318-362). Cambridge, MA: MIT Press.

(Added April 2011)