ReLU, which stands for Rectified Linear Unit, is a non-linear activation function commonly used in deep neural networks and machine learning models. It can be mathematically represented as

f(x) = max(0, x)

where x is the input value.
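As an illustration, here is a minimal sketch of this formula in Python using NumPy; the function name relu and the sample inputs are chosen for this example and are not taken from any particular library.

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: returns x where x > 0, and 0 otherwise."""
    return np.maximum(0, x)

# Negative inputs are clipped to 0; positive inputs pass through unchanged.
inputs = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(inputs))  # [0.  0.  0.  1.5 3. ]
```

Applied element-wise to a vector of inputs, the function zeroes out every negative value while leaving positive values untouched, which is exactly the max(0, x) behavior described above.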