jax.nn module
Common functions for neural network libraries.
Activation functions
| Function | Description |
| --- | --- |
| relu | Rectified linear unit activation function. |
| relu6 | Rectified Linear Unit 6 activation function. |
| sigmoid | Sigmoid activation function. |
| softplus | Softplus activation function. |
| soft_sign | Soft-sign activation function. |
| silu | SiLU (a.k.a. swish) activation function. |
| swish | SiLU (a.k.a. swish) activation function. |
| log_sigmoid | Log-sigmoid activation function. |
| leaky_relu | Leaky rectified linear unit activation function. |
| hard_sigmoid | Hard Sigmoid activation function. |
| hard_silu | Hard SiLU (swish) activation function. |
| hard_swish | Hard SiLU (swish) activation function. |
| hard_tanh | Hard \(\mathrm{tanh}\) activation function. |
| elu | Exponential linear unit activation function. |
| celu | Continuously-differentiable exponential linear unit activation. |
| selu | Scaled exponential linear unit activation. |
| gelu | Gaussian error linear unit activation function. |
| glu | Gated linear unit activation function. |
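Each activation is a pure, element-wise function on arrays. A minimal usage sketch (the input values below are illustrative, not taken from this page):

```python
import jax.numpy as jnp
from jax import nn

x = jnp.array([-2.0, -0.5, 0.0, 0.5, 2.0])

print(nn.relu(x))                             # negatives clamped to 0
print(nn.leaky_relu(x, negative_slope=0.01))  # small slope instead of a hard 0 for x < 0
print(nn.sigmoid(x))                          # values squashed into (0, 1)
print(nn.gelu(x))                             # smooth ReLU-like curve (tanh approximation by default)
```

Because these are ordinary JAX functions, they compose with jit, grad, and vmap like any other array transform.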
Other functions
| Function | Description |
| --- | --- |
| softmax | Softmax function. |
| log_softmax | Log-Softmax function. |
| logsumexp | Compute the log of the sum of exponentials of input elements. |
| standardize | Normalizes an array by subtracting mean and dividing by sqrt(var). |
| one_hot | One-hot encodes the given indices. |
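A short sketch of how these utilities typically combine in a classification step (the logits, labels, and variable names are illustrative assumptions, not from this page):

```python
import jax.numpy as jnp
from jax import nn

logits = jnp.array([[1.0, 2.0, 0.5],
                    [0.1, 0.1, 3.0]])

probs = nn.softmax(logits, axis=-1)          # each row sums to 1
log_probs = nn.log_softmax(logits, axis=-1)  # numerically stable log of the softmax
log_z = nn.logsumexp(logits, axis=-1)        # per-row log-partition term

labels = jnp.array([1, 2])
targets = nn.one_hot(labels, num_classes=3)  # (2, 3) array with 1.0 at each label index

# Cross-entropy per example from log-probabilities and one-hot targets
loss = -jnp.sum(targets * log_probs, axis=-1)

standardized = nn.standardize(logits, axis=-1)  # zero mean, unit variance along the last axis
print(probs, log_z, loss, standardized)
```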