


One continuous approximation of the Heaviside step function is the sigmoid-shaped logistic function. The equation of this function is f(net_{j}) = 1 / (1 + exp(−net_{j} + bias_{j})). It asymptotes to a value of 0 as net input approaches negative infinity, and asymptotes to a value of 1 as net input approaches positive infinity. When net input is equal to the threshold (or bias) of the logistic, activity is exactly 0.5. Because the logistic function is continuous, its derivative can be calculated, and calculus can be used as a tool to derive new learning rules (Rumelhart et al., 1986). At the same time, it is still nonlinear, so logistic activities can still be interpreted as truth values assigned to propositions. The logistic function is the activation function that is typically used to define the integration devices that make up a modern multilayer perceptron.
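The logistic function and its derivative can be sketched in a few lines of Python (the function names here are illustrative, not drawn from any particular library); the derivative's simple closed form, f'(net) = f(net)(1 − f(net)), is what makes the function convenient for deriving learning rules:

```python
import math

def logistic(net, bias=0.0):
    """Logistic activation: squashes net input into (0, 1).

    Equals exactly 0.5 when net input equals the bias (threshold),
    approaches 0 as net -> -infinity and 1 as net -> +infinity.
    """
    return 1.0 / (1.0 + math.exp(-net + bias))

def logistic_derivative(net, bias=0.0):
    """Derivative of the logistic: f'(net) = f(net) * (1 - f(net))."""
    f = logistic(net, bias)
    return f * (1.0 - f)

# Behavior at and far from the threshold:
print(logistic(0.0))    # 0.5 at threshold
print(logistic(10.0))   # close to 1
print(logistic(-10.0))  # close to 0
```

The derivative peaks at 0.25 when net input equals the bias and falls toward zero in the asymptotic tails, which is why logistic units change their behavior most readily near their threshold.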
References:
 Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning representations by back-propagating errors. Nature, 323, 533–536.
(Added April 2011)



