Importance of Activation Functions
The purpose of activation functions is to introduce non-linearities into the network. A non-linear function allows the network to approximate arbitrarily complex functions, which is what makes neural networks so powerful.
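To see why the non-linearity is essential, here is a minimal sketch (the weights and the use of NumPy are assumptions for illustration; the text names neither): stacking linear layers with no activation in between collapses into a single linear map, whereas inserting a sigmoid between them does not.

```python
import numpy as np

# Minimal sketch of why the non-linearity matters. The weight values here
# are arbitrary illustrative choices, not taken from the text.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2))   # first layer weights (assumed)
W2 = rng.normal(size=(1, 3))   # second layer weights (assumed)
x = np.array([[-1.0], [2.0]])  # the two inputs used in the example below

# Two "layers" with no activation in between...
deep_linear = W2 @ (W1 @ x)
# ...are exactly equivalent to one linear layer with combined weights:
single_linear = (W2 @ W1) @ x
print(np.allclose(deep_linear, single_linear))  # True

# With a sigmoid between the layers the composition is no longer linear,
# which is what lets the network model more complex functions.
nonlinear = W2 @ sigmoid(W1 @ x)
print(nonlinear)
```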
For example, consider a trained network with weights W that has only two inputs, x1 and x2, and passes their weighted sum through a non-linearity. Before applying the non-linearity, if we feed in a new input x1 = -1 and x2 = 2 and compute the linear part (the line), we get -6.
When we apply a sigmoid non-linearity, the output is collapsed into the range between 0 and 1: the sigmoid maps any input greater than 0 to a value above 0.5 and any input less than 0 to a value below 0.5, so for an input of -6 the output lands well below 0.5, close to 0.
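The section states that the pre-activation value is -6 but does not give the weights, so the sketch below assumes a bias of 1 and weights [3, -2], which are one choice that reproduces that value.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# The text gives the inputs and the pre-activation result (-6) but not the
# weights; the bias and weights below are assumed values that reproduce it.
w0 = 1.0                      # assumed bias
W = np.array([3.0, -2.0])     # assumed weights
x = np.array([-1.0, 2.0])     # x1 = -1, x2 = 2

z = w0 + W @ x                # linear part: 1 + 3*(-1) + (-2)*2 = -6
y = sigmoid(z)                # the non-linearity collapses z into (0, 1)

print(z)   # -6.0
print(y)   # ~0.0025, below 0.5 since z < 0
```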
The reason we rely on non-linear activation functions is that in practice we deal with networks that have thousands or millions of parameters operating in high-dimensional spaces, where visualizing these kinds of plots becomes extremely difficult; the non-linearities are what allow the network to capture such complex structure.