nerva_jax.loss_functions
Analytic loss functions and their gradients used during training.
Functions are provided in vector (lowercase) and matrix (uppercase) forms. Concrete LossFunction classes wrap these for use in the training loop.
Functions
- Cross_entropy_loss(Y, T): Cross entropy loss for matrices: -sum(T ⊙ log(Y)).
- Cross_entropy_loss_gradient(Y, T): Gradient of cross entropy loss for matrices.
- Logistic_cross_entropy_loss(Y, T): Logistic cross entropy loss for matrices.
- Logistic_cross_entropy_loss_gradient(Y, T): Gradient of logistic cross entropy loss for matrices.
- Negative_log_likelihood_loss(Y, T): Negative log likelihood loss for matrices.
- Negative_log_likelihood_loss_gradient(Y, T): Gradient of negative log likelihood loss for matrices.
- Softmax_cross_entropy_loss(Y, T): Softmax cross entropy loss for matrices.
- Softmax_cross_entropy_loss_gradient(Y, T): Gradient of softmax cross entropy loss for matrices.
- Softmax_cross_entropy_loss_gradient_one_hot(Y, T): Gradient of softmax cross entropy for one-hot targets (matrices).
- Squared_error_loss(Y, T): Squared error loss for matrices: sum of ||Y - T||².
- Squared_error_loss_gradient(Y, T): Gradient of squared error loss for matrices.
- Stable_softmax_cross_entropy_loss(Y, T): Stable softmax cross entropy loss for matrices.
- Stable_softmax_cross_entropy_loss_gradient(Y, T): Gradient of stable softmax cross entropy loss for matrices.
- Stable_softmax_cross_entropy_loss_gradient_one_hot(Y, T): Gradient of stable softmax cross entropy for one-hot targets (matrices).
- cross_entropy_loss(y, t): Cross entropy loss for vectors: -t^T log(y).
- cross_entropy_loss_gradient(y, t): Gradient of cross entropy loss for vectors.
- logistic_cross_entropy_loss(y, t): Logistic cross entropy loss for vectors.
- logistic_cross_entropy_loss_gradient(y, t): Gradient of logistic cross entropy loss for vectors.
- negative_log_likelihood_loss(y, t): Negative log likelihood loss for vectors.
- negative_log_likelihood_loss_gradient(y, t): Gradient of negative log likelihood loss for vectors.
- parse_loss_function(text): Parse a loss function name into a LossFunction instance.
- softmax_cross_entropy_loss(y, t): Softmax cross entropy loss for vectors.
- softmax_cross_entropy_loss_gradient(y, t): Gradient of softmax cross entropy loss for vectors.
- softmax_cross_entropy_loss_gradient_one_hot(y, t): Gradient of softmax cross entropy for one-hot targets.
- squared_error_loss(y, t): Squared error loss for vectors: ||y - t||².
- squared_error_loss_gradient(y, t): Gradient of squared error loss for vectors.
- stable_softmax_cross_entropy_loss(y, t): Stable softmax cross entropy loss for vectors.
- stable_softmax_cross_entropy_loss_gradient(y, t): Gradient of stable softmax cross entropy loss for vectors.
- stable_softmax_cross_entropy_loss_gradient_one_hot(y, t): Gradient of stable softmax cross entropy for one-hot targets.
Classes
- CrossEntropyLossFunction: Cross entropy loss function for classification with probabilities.
- LogisticCrossEntropyLossFunction: Logistic cross entropy loss for binary classification.
- LossFunction: Interface for loss functions with value and gradient on batch matrices.
- NegativeLogLikelihoodLossFunction: Negative log likelihood loss for probabilistic outputs.
- SoftmaxCrossEntropyLossFunction: Softmax cross entropy loss for classification with logits.
- SquaredErrorLossFunction: Squared error loss function for regression tasks.
- StableSoftmaxCrossEntropyLossFunction: Numerically stable softmax cross entropy loss for classification.
- nerva_jax.loss_functions.squared_error_loss(y, t)[source]
Squared error loss for vectors: ||y - t||².
- nerva_jax.loss_functions.squared_error_loss_gradient(y, t)[source]
Gradient of squared error loss for vectors.
- nerva_jax.loss_functions.Squared_error_loss(Y, T)[source]
Squared error loss for matrices: sum of ||Y - T||².
- nerva_jax.loss_functions.Squared_error_loss_gradient(Y, T)[source]
Gradient of squared error loss for matrices.
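As an illustration, the squared error pair above can be sketched as follows. This is a minimal numpy reimplementation for exposition only; the library itself operates on jax.numpy arrays, whose elementwise API matches numpy's.

```python
import numpy as np

def squared_error_loss(y, t):
    """||y - t||^2 for a single output vector y and target vector t."""
    return np.sum((y - t) ** 2)

def squared_error_loss_gradient(y, t):
    """Gradient with respect to y: 2 (y - t)."""
    return 2 * (y - t)

def Squared_error_loss(Y, T):
    """Batch form: sum over rows of ||Y_i - T_i||^2."""
    return np.sum((Y - T) ** 2)

def Squared_error_loss_gradient(Y, T):
    """Batch gradient with respect to Y: 2 (Y - T)."""
    return 2 * (Y - T)
```

The matrix form is just the vector form summed over the rows of the batch, which is why its gradient has the same shape as Y.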
- nerva_jax.loss_functions.cross_entropy_loss(y, t)[source]
Cross entropy loss for vectors: -t^T log(y).
- nerva_jax.loss_functions.cross_entropy_loss_gradient(y, t)[source]
Gradient of cross entropy loss for vectors.
- nerva_jax.loss_functions.Cross_entropy_loss(Y, T)[source]
Cross entropy loss for matrices: -sum(T ⊙ log(Y)).
- nerva_jax.loss_functions.Cross_entropy_loss_gradient(Y, T)[source]
Gradient of cross entropy loss for matrices.
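The cross entropy formulas -t^T log(y) and -sum(T ⊙ log(Y)) can be sketched directly (illustrative numpy version; y and Y are assumed to hold probabilities, as the class description above states):

```python
import numpy as np

def cross_entropy_loss(y, t):
    """-t^T log(y): y is a probability vector, t a target distribution."""
    return -np.dot(t, np.log(y))

def cross_entropy_loss_gradient(y, t):
    """Elementwise gradient with respect to y: -t / y."""
    return -t / y

def Cross_entropy_loss(Y, T):
    """-sum(T ⊙ log(Y)) over a batch of probability rows."""
    return -np.sum(T * np.log(Y))

def Cross_entropy_loss_gradient(Y, T):
    """Batch gradient with respect to Y: -T / Y, elementwise."""
    return -T / Y
```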
- nerva_jax.loss_functions.softmax_cross_entropy_loss(y, t)[source]
Softmax cross entropy loss for vectors.
- nerva_jax.loss_functions.softmax_cross_entropy_loss_gradient(y, t)[source]
Gradient of softmax cross entropy loss for vectors.
- nerva_jax.loss_functions.softmax_cross_entropy_loss_gradient_one_hot(y, t)[source]
Gradient of softmax cross entropy for one-hot targets.
- nerva_jax.loss_functions.Softmax_cross_entropy_loss(Y, T)[source]
Softmax cross entropy loss for matrices.
- nerva_jax.loss_functions.Softmax_cross_entropy_loss_gradient(Y, T)[source]
Gradient of softmax cross entropy loss for matrices.
- nerva_jax.loss_functions.Softmax_cross_entropy_loss_gradient_one_hot(Y, T)[source]
Gradient of softmax cross entropy for one-hot targets (matrices).
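A sketch of the softmax variant, where y holds raw logits rather than probabilities (numpy for illustration). The well-known simplification for one-hot targets, gradient = softmax(y) - t, is what makes this pairing attractive in training loops:

```python
import numpy as np

def softmax(y):
    """Naive softmax of a logit vector (no overflow protection)."""
    e = np.exp(y)
    return e / np.sum(e)

def softmax_cross_entropy_loss(y, t):
    """-t^T log(softmax(y)); y holds raw logits."""
    return -np.dot(t, np.log(softmax(y)))

def softmax_cross_entropy_loss_gradient_one_hot(y, t):
    """For one-hot t the gradient collapses to softmax(y) - t."""
    return softmax(y) - t
```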
- nerva_jax.loss_functions.stable_softmax_cross_entropy_loss(y, t)[source]
Stable softmax cross entropy loss for vectors.
- nerva_jax.loss_functions.stable_softmax_cross_entropy_loss_gradient(y, t)[source]
Gradient of stable softmax cross entropy loss for vectors.
- nerva_jax.loss_functions.stable_softmax_cross_entropy_loss_gradient_one_hot(y, t)[source]
Gradient of stable softmax cross entropy for one-hot targets.
- nerva_jax.loss_functions.Stable_softmax_cross_entropy_loss(Y, T)[source]
Stable softmax cross entropy loss for matrices.
- nerva_jax.loss_functions.Stable_softmax_cross_entropy_loss_gradient(Y, T)[source]
Gradient of stable softmax cross entropy loss for matrices.
- nerva_jax.loss_functions.Stable_softmax_cross_entropy_loss_gradient_one_hot(Y, T)[source]
Gradient of stable softmax cross entropy for one-hot targets (matrices).
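What "stable" means here can be sketched as follows: shifting the logits by their maximum before exponentiating leaves the softmax unchanged but prevents overflow in exp, and computing log-softmax directly avoids log of a tiny number (numpy version for illustration):

```python
import numpy as np

def stable_softmax(y):
    """Shift by the max logit so exp never overflows; the result is unchanged."""
    z = y - np.max(y)
    e = np.exp(z)
    return e / np.sum(e)

def stable_softmax_cross_entropy_loss(y, t):
    """-t^T log_softmax(y), with log_softmax(y) = z - log(sum(exp(z)))."""
    z = y - np.max(y)
    log_softmax = z - np.log(np.sum(np.exp(z)))
    return -np.dot(t, log_softmax)

def stable_softmax_cross_entropy_loss_gradient_one_hot(y, t):
    """Same one-hot simplification as the unstable variant: softmax(y) - t."""
    return stable_softmax(y) - t
```

With logits as large as 1000 the naive exp would overflow to inf, while the shifted version below returns the exact answer.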
- nerva_jax.loss_functions.logistic_cross_entropy_loss(y, t)[source]
Logistic cross entropy loss for vectors.
- nerva_jax.loss_functions.logistic_cross_entropy_loss_gradient(y, t)[source]
Gradient of logistic cross entropy loss for vectors.
- nerva_jax.loss_functions.Logistic_cross_entropy_loss(Y, T)[source]
Logistic cross entropy loss for matrices.
- nerva_jax.loss_functions.Logistic_cross_entropy_loss_gradient(Y, T)[source]
Gradient of logistic cross entropy loss for matrices.
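Assuming the conventional definition -t^T log(σ(y)) with σ the logistic sigmoid (the docs above do not spell out the formula, so treat this as a sketch rather than the library's exact code):

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

def logistic_cross_entropy_loss(y, t):
    """-t^T log(sigmoid(y)), with y holding raw scores."""
    return -np.dot(t, np.log(sigmoid(y)))

def logistic_cross_entropy_loss_gradient(y, t):
    """d/dy log(sigmoid(y)) = 1 - sigmoid(y), so the gradient is t * (sigmoid(y) - 1)."""
    return t * (sigmoid(y) - 1.0)
```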
- nerva_jax.loss_functions.negative_log_likelihood_loss(y, t)[source]
Negative log likelihood loss for vectors.
- nerva_jax.loss_functions.negative_log_likelihood_loss_gradient(y, t)[source]
Gradient of negative log likelihood loss for vectors.
- nerva_jax.loss_functions.Negative_log_likelihood_loss(Y, T)[source]
Negative log likelihood loss for matrices.
- nerva_jax.loss_functions.Negative_log_likelihood_loss_gradient(Y, T)[source]
Gradient of negative log likelihood loss for matrices.
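Assuming the usual reading of negative log likelihood on probabilistic outputs, -log(y^T t) with t typically one-hot (again a numpy sketch under that assumption, not the library's verbatim code):

```python
import numpy as np

def negative_log_likelihood_loss(y, t):
    """-log(y^T t): negative log of the probability assigned to the target."""
    return -np.log(np.dot(y, t))

def negative_log_likelihood_loss_gradient(y, t):
    """Gradient with respect to y: -t / (y^T t)."""
    return -t / np.dot(y, t)
```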
- class nerva_jax.loss_functions.LossFunction[source]
Bases: object
Interface for loss functions with value and gradient on batch matrices.
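A hypothetical sketch of what this interface and one concrete subclass might look like; the method names here are illustrative assumptions, not taken from the library:

```python
import numpy as np

class LossFunction:
    """Illustrative interface: a loss value and its gradient on batch matrices."""
    def __call__(self, Y, T):
        raise NotImplementedError
    def gradient(self, Y, T):
        raise NotImplementedError

class SquaredErrorLossFunction(LossFunction):
    """Wraps the matrix-form squared error functions described above."""
    def __call__(self, Y, T):
        return np.sum((Y - T) ** 2)
    def gradient(self, Y, T):
        return 2 * (Y - T)
```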
- class nerva_jax.loss_functions.SquaredErrorLossFunction[source]
Bases: LossFunction
Squared error loss function for regression tasks.
- class nerva_jax.loss_functions.CrossEntropyLossFunction[source]
Bases: LossFunction
Cross entropy loss function for classification with probabilities.
- class nerva_jax.loss_functions.SoftmaxCrossEntropyLossFunction[source]
Bases: LossFunction
Softmax cross entropy loss for classification with logits.
- class nerva_jax.loss_functions.StableSoftmaxCrossEntropyLossFunction[source]
Bases: LossFunction
Numerically stable softmax cross entropy loss for classification.
- class nerva_jax.loss_functions.LogisticCrossEntropyLossFunction[source]
Bases: LossFunction
Logistic cross entropy loss for binary classification.
- class nerva_jax.loss_functions.NegativeLogLikelihoodLossFunction[source]
Bases: LossFunction
Negative log likelihood loss for probabilistic outputs.
- nerva_jax.loss_functions.parse_loss_function(text: str) → LossFunction [source]
Parse a loss function name into a LossFunction instance.
Supported names: SquaredError, CrossEntropy, SoftmaxCrossEntropy, LogisticCrossEntropy, NegativeLogLikelihood.
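A plausible sketch of the name-to-loss dispatch. For brevity this hypothetical registry maps a few of the supported names to bare callables; the real function returns LossFunction instances:

```python
import numpy as np

# Hypothetical dispatch table keyed by the supported names (subset shown).
LOSS_REGISTRY = {
    "SquaredError": lambda Y, T: np.sum((Y - T) ** 2),
    "CrossEntropy": lambda Y, T: -np.sum(T * np.log(Y)),
    "NegativeLogLikelihood": lambda Y, T: -np.sum(np.log(np.sum(Y * T, axis=1))),
}

def parse_loss_function(text):
    """Look a loss up by name, rejecting unknown names with a clear error."""
    try:
        return LOSS_REGISTRY[text]
    except KeyError:
        raise ValueError(f"unknown loss function: {text}")
```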