•Why won’t these sequences of layers add power?
•Because each unit’s activation is a linear function of its net input
•For extra layers to add something that linear algebra cannot collapse away, a nonlinear transformation of the net input must be provided: the composition of two linear maps is itself a single linear map, W2(W1x) = (W2W1)x (see the first sketch after this list)
•In short, we need to use a nonlinear activation function in our processors
•Fortunately, many are available, e.g., the logistic sigmoid, tanh, and ReLU (see the second sketch below)
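
As an illustration (not part of the original slides), here is a minimal NumPy sketch of the collapse argument: two purely linear layers in sequence compute exactly what one layer with the product of the weight matrices computes, so the extra layer adds nothing. The layer sizes and random weights are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 4 inputs -> 5 hidden units -> 3 outputs
W1 = rng.standard_normal((5, 4))   # layer-1 weights
W2 = rng.standard_normal((3, 5))   # layer-2 weights
x = rng.standard_normal(4)         # an arbitrary input vector

# Two linear layers applied in sequence...
two_layer = W2 @ (W1 @ x)

# ...equal one linear layer whose weight matrix is the product W2 @ W1.
one_layer = (W2 @ W1) @ x

print(np.allclose(two_layer, one_layer))   # True: the extra layer adds no power
```

(The same holds with bias terms: W2(W1x + b1) + b2 = (W2W1)x + (W2b1 + b2), still a single affine map.)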
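Continuing the same sketch, here are a few commonly used nonlinear activation functions; the particular choices (sigmoid, tanh, ReLU) are illustrative, not a claim about which ones the course adopts. Inserting any of them between the layers breaks the single-matrix collapse shown above.

```python
import numpy as np

def sigmoid(z):
    """Logistic sigmoid: squashes net input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    """Rectified linear unit: zero for negative net input."""
    return np.maximum(0.0, z)

rng = np.random.default_rng(0)
W1 = rng.standard_normal((5, 4))
W2 = rng.standard_normal((3, 5))
x = rng.standard_normal(4)

collapsed = (W2 @ W1) @ x          # the best a single linear layer can do

for name, f in [("sigmoid", sigmoid), ("tanh", np.tanh), ("relu", relu)]:
    # A nonlinear transformation of the net input between the layers...
    stacked = W2 @ f(W1 @ x)
    # ...is expected to disagree with the collapsed computation,
    # because no single matrix reproduces f for all inputs.
    print(name, np.allclose(stacked, collapsed))
```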