nerva_torch.loss_functions_torch
Thin wrappers around PyTorch loss modules for comparison and testing.
Functions
- negative_likelihood_loss_torch(Y, T): Computes the negative likelihood loss between Y and T.
- softmax_cross_entropy_loss_torch(Y, T): Computes the softmax cross entropy loss between Y and T.
- squared_error_loss_torch(Y, T): Computes the squared error loss between Y and T.
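All three wrappers share the signature f(Y, T) and return a float, which makes them easy to compare side by side. A minimal usage sketch follows; treating T as a one-hot target matrix with the same shape as Y is an assumption of this sketch, not something stated above.

```python
import torch

from nerva_torch.loss_functions_torch import (
    negative_likelihood_loss_torch,
    softmax_cross_entropy_loss_torch,
    squared_error_loss_torch,
)

# Hypothetical data: a batch of 3 predictions over 4 classes with one-hot
# targets. The expected encoding of T is an assumption of this sketch.
Y = torch.tensor([[0.1, 0.2, 0.4, 0.3],
                  [0.3, 0.1, 0.3, 0.3],
                  [0.2, 0.5, 0.2, 0.1]])
T = torch.tensor([[0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0, 0.0]])

print(squared_error_loss_torch(Y, T))
print(softmax_cross_entropy_loss_torch(Y, T))
# Per the note below, the negative likelihood wrapper expects Y to already
# contain log probabilities, so the (row-stochastic) Y is passed through log.
print(negative_likelihood_loss_torch(torch.log(Y), T))
```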
- nerva_torch.loss_functions_torch.squared_error_loss_torch(Y, T)
  Computes the squared error loss between Y and T.
  Parameters:
    Y (torch.Tensor): The predicted values.
    T (torch.Tensor): The target values.
  Returns:
    float: The computed loss.
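The entry above does not pin down the exact definition, so the following is only a sketch of an equivalent computation in plain PyTorch, assuming a sum reduction and no factor of 1/2:

```python
import torch

def squared_error_loss_sketch(Y: torch.Tensor, T: torch.Tensor) -> float:
    """Sum of squared differences between predictions Y and targets T.

    The reduction (sum vs. mean) and the absence of a 1/2 factor are
    assumptions of this sketch; the actual wrapper may differ.
    """
    return torch.sum((Y - T) ** 2).item()
```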
- nerva_torch.loss_functions_torch.softmax_cross_entropy_loss_torch(Y, T)
  Computes the softmax cross entropy loss between Y and T.
  Parameters:
    Y (torch.Tensor): The predicted values.
    T (torch.Tensor): The target values.
  Returns:
    float: The computed loss.
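A minimal sketch of an equivalent computation, assuming Y holds pre-softmax scores, T holds one-hot (or probability) targets of the same shape, and the per-row losses are summed over the batch; the actual wrapper may instead delegate to torch.nn.CrossEntropyLoss with a different reduction:

```python
import torch
import torch.nn.functional as F

def softmax_cross_entropy_loss_sketch(Y: torch.Tensor, T: torch.Tensor) -> float:
    """Cross entropy between the row-wise softmax of Y and the targets T.

    Computes -sum_ij T_ij * log(softmax(Y)_ij); log_softmax is used for
    numerical stability. The sum reduction is an assumption of this sketch.
    """
    return -(T * F.log_softmax(Y, dim=1)).sum().item()
```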
- nerva_torch.loss_functions_torch.negative_likelihood_loss_torch(Y, T)
  Computes the negative likelihood loss between Y and T. Note that PyTorch does not apply the log function, since it assumes Y is the output of a log softmax layer. For this reason we omit “log” in the name.
  Parameters:
    Y (torch.Tensor): The predicted values.
    T (torch.Tensor): The target values.
  Returns:
    float: The computed loss.
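Consistent with the note above, torch.nn.functional.nll_loss expects log-probabilities as input and class indices as targets, and does not apply a log itself. A minimal sketch, assuming T is one-hot (hence the argmax) and a sum reduction; both are assumptions, not the documented behaviour:

```python
import torch
import torch.nn.functional as F

def negative_likelihood_loss_sketch(Y: torch.Tensor, T: torch.Tensor) -> float:
    """Negative likelihood loss, with Y assumed to hold log probabilities.

    F.nll_loss does not apply a log itself (see the note above). Converting
    the assumed one-hot targets T to class indices and using a sum reduction
    are choices made for this sketch only.
    """
    return F.nll_loss(Y, T.argmax(dim=1), reduction='sum').item()
```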