jax.nn.relu

jax.nn.relu(x)
Rectified linear unit activation function.
Computes the element-wise function:
\[\mathrm{relu}(x) = \max(x, 0)\]

except under differentiation, we take:
\[\nabla \mathrm{relu}(0) = 0\]

For more information see Numerical influence of ReLU’(0) on backpropagation.
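A minimal usage sketch, not from the original page, showing both the element-wise forward computation and the derivative convention at zero via the public jax.nn.relu and jax.grad APIs:

```python
import jax
import jax.numpy as jnp

x = jnp.array([-2.0, -1.0, 0.0, 1.0, 2.0])

# Element-wise max(x, 0)
print(jax.nn.relu(x))  # [0. 0. 0. 1. 2.]

# Under differentiation, relu'(0) is taken to be 0,
# per the custom derivative rule noted above.
print(jax.grad(jax.nn.relu)(0.0))  # 0.0
```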