ReLU, which stands for Rectified Linear Unit, is a non-linear activation function commonly used in deep neural networks and machine learning models. It can be mathematically represented as
$$\mathrm{ReLU}(x) = \max(0, x)$$

where $x$ is the input value.
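To make the definition concrete, here is a minimal sketch of ReLU applied element-wise using NumPy; the function name `relu` and the sample values are chosen for illustration only:

```python
import numpy as np

def relu(x):
    """Element-wise ReLU: keeps positive values, replaces negatives with 0."""
    return np.maximum(0, x)

# Negative inputs are clipped to zero; positive inputs pass through unchanged.
values = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(values))  # [0.  0.  0.  1.5 3. ]
```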