Let’s start with the basics: why would we even need an activation function, and what is it?
Source : Activation Functions : Sigmoid, ReLU, Leaky ReLU and Softmax basics for Neural Networks and Deep…
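The short answer is non-linearity: composing linear layers without an activation collapses into a single linear map, so the activation is what lets a network model non-linear relationships. As a minimal illustrative sketch (plain NumPy, not code from the cited article), here are the four activations named in the source:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into (0, 1); historically used for binary outputs.
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Passes positive inputs through unchanged, zeroes out negatives.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but keeps a small slope (alpha) for negative inputs,
    # so those units still receive a gradient.
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Turns a vector of scores into a probability distribution.
    # Subtracting the max is a standard numerical-stability trick.
    e = np.exp(x - np.max(x))
    return e / e.sum()

z = np.array([-2.0, 0.0, 3.0])
print(sigmoid(z), relu(z), leaky_relu(z), softmax(z))
```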