nerva_numpy.activation_functions

Activation functions and utilities used by the MLP implementation.

This module provides simple callable classes for common activations and a parser that turns textual specifications into activation instances (e.g. “ReLU”, “LeakyReLU(alpha=0.1)”, “SReLU(al=0, tl=0, ar=0, tr=1)”).

Functions

All_relu(alpha): AllReLU factory.
All_relu_gradient(alpha): Gradient factory for AllReLU.
Hyperbolic_tangent(X): Hyperbolic tangent activation.
Hyperbolic_tangent_gradient(X): Gradient of tanh: 1 - tanh²(X).
Leaky_relu(alpha): Leaky ReLU factory: max(X, alpha * X).
Leaky_relu_gradient(alpha): Gradient factory for leaky ReLU.
Relu(X): Rectified linear unit activation: max(0, X).
Relu_gradient(X): Gradient of ReLU: 1 where X > 0, 0 elsewhere.
Sigmoid(X): Sigmoid activation: 1 / (1 + exp(-X)).
Sigmoid_gradient(X): Gradient of sigmoid: σ(X) * (1 - σ(X)).
Srelu(al, tl, ar, tr): SReLU factory: smooth rectified linear with learnable parameters.
Srelu_gradient(al, tl, ar, tr): Gradient factory for SReLU.
parse_activation(text): Parse a textual activation specification into an ActivationFunction.

Classes

ActivationFunction(): Interface for activation functions with value and gradient methods.
AllReLUActivation(alpha): AllReLU activation (alternative parameterization of leaky ReLU).
HyperbolicTangentActivation(): Hyperbolic tangent activation function.
LeakyReLUActivation(alpha): Leaky ReLU activation: max(x, alpha * x).
ReLUActivation(): ReLU activation function: max(0, x).
SReLUActivation([al, tl, ar, tr]): Smooth rectified linear activation with learnable parameters.
SigmoidActivation(): Sigmoid activation function: 1 / (1 + exp(-x)).

nerva_numpy.activation_functions.Relu(X: numpy.ndarray) → numpy.ndarray

Rectified linear unit activation: max(0, X).

nerva_numpy.activation_functions.Relu_gradient(X: numpy.ndarray) → numpy.ndarray

Gradient of ReLU: 1 where X > 0, 0 elsewhere.
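
The following is a minimal NumPy sketch of what these two functions compute, based only on the descriptions above; it is an illustration, not the library's source:

    import numpy as np

    def relu(X: np.ndarray) -> np.ndarray:
        # Elementwise max(0, X).
        return np.maximum(0, X)

    def relu_gradient(X: np.ndarray) -> np.ndarray:
        # 1 where X > 0, 0 elsewhere.
        return (X > 0).astype(X.dtype)

    X = np.array([[-1.5, 0.0, 2.0]])
    print(relu(X))           # [[0. 0. 2.]]
    print(relu_gradient(X))  # [[0. 0. 1.]]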

nerva_numpy.activation_functions.Leaky_relu(alpha)

Leaky ReLU factory: max(X, alpha * X).

nerva_numpy.activation_functions.Leaky_relu_gradient(alpha)

Gradient factory for leaky ReLU.
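
Since these two are factories, they presumably return callables closed over alpha. A hedged sketch of that pattern (illustrative only, not the library's implementation), assuming 0 < alpha < 1:

    import numpy as np

    def leaky_relu(alpha):
        # Return a callable computing max(X, alpha * X) elementwise.
        def f(X: np.ndarray) -> np.ndarray:
            return np.maximum(X, alpha * X)
        return f

    def leaky_relu_gradient(alpha):
        # Return a callable computing the derivative: 1 where X > 0, alpha elsewhere.
        def f(X: np.ndarray) -> np.ndarray:
            return np.where(X > 0, 1.0, alpha)
        return f

    act, grad = leaky_relu(0.1), leaky_relu_gradient(0.1)
    X = np.array([-2.0, 3.0])
    print(act(X))   # [-0.2  3. ]
    print(grad(X))  # [0.1 1. ]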

nerva_numpy.activation_functions.All_relu(alpha)

AllReLU factory.

nerva_numpy.activation_functions.All_relu_gradient(alpha)

Gradient factory for AllReLU.

nerva_numpy.activation_functions.Hyperbolic_tangent(X: numpy.ndarray) → numpy.ndarray

Hyperbolic tangent activation.

nerva_numpy.activation_functions.Hyperbolic_tangent_gradient(X: numpy.ndarray) → numpy.ndarray

Gradient of tanh: 1 - tanh²(X).
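
A minimal sketch matching the formulas above (illustrative, not the library's source):

    import numpy as np

    def hyperbolic_tangent(X: np.ndarray) -> np.ndarray:
        return np.tanh(X)

    def hyperbolic_tangent_gradient(X: np.ndarray) -> np.ndarray:
        # d/dX tanh(X) = 1 - tanh²(X)
        T = np.tanh(X)
        return 1.0 - T * T

    X = np.linspace(-2.0, 2.0, 5)
    print(hyperbolic_tangent_gradient(X))  # largest (1.0) at X = 0, approaching 0 in the tails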

nerva_numpy.activation_functions.Sigmoid(X: numpy.ndarray) → numpy.ndarray

Sigmoid activation: 1 / (1 + exp(-X)).

nerva_numpy.activation_functions.Sigmoid_gradient(X: numpy.ndarray) → numpy.ndarray

Gradient of sigmoid: σ(X) * (1 - σ(X)).
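
A minimal sketch matching the stated formulas (whether the library uses a numerically stabilized variant for large negative X is not documented here):

    import numpy as np

    def sigmoid(X: np.ndarray) -> np.ndarray:
        # 1 / (1 + exp(-X))
        return 1.0 / (1.0 + np.exp(-X))

    def sigmoid_gradient(X: np.ndarray) -> np.ndarray:
        # σ(X) * (1 - σ(X))
        S = sigmoid(X)
        return S * (1.0 - S)

    X = np.array([-4.0, 0.0, 4.0])
    print(sigmoid(X))           # approximately [0.018 0.5 0.982]
    print(sigmoid_gradient(X))  # peaks at 0.25 for X = 0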

nerva_numpy.activation_functions.Srelu(al, tl, ar, tr)

SReLU factory: smooth rectified linear with learnable parameters.

nerva_numpy.activation_functions.Srelu_gradient(al, tl, ar, tr)

Gradient factory for SReLU.
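
The exact piecewise form is not spelled out here. One common SReLU formulation (the S-shaped rectified linear unit of Jin et al., 2016) treats al/ar as left/right slopes and tl/tr as left/right thresholds; the sketch below assumes that form and should be read as an illustration, not this library's definition:

    import numpy as np

    def srelu(al, tl, ar, tr):
        # Assumed piecewise-linear SReLU:
        #   X <= tl      : tl + al * (X - tl)
        #   tl < X < tr  : X
        #   X >= tr      : tr + ar * (X - tr)
        def f(X: np.ndarray) -> np.ndarray:
            return np.where(X <= tl, tl + al * (X - tl),
                   np.where(X < tr, X, tr + ar * (X - tr)))
        return f

    def srelu_gradient(al, tl, ar, tr):
        # Corresponding derivative: al on the left, 1 in the middle, ar on the right.
        def f(X: np.ndarray) -> np.ndarray:
            return np.where(X <= tl, al, np.where(X < tr, 1.0, ar))
        return f

    act = srelu(al=0.0, tl=0.0, ar=0.0, tr=1.0)  # defaults used by SReLUActivation below
    print(act(np.array([-2.0, 0.5, 3.0])))  # [0.  0.5 1. ]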

class nerva_numpy.activation_functions.ActivationFunction

Bases: object

Interface for activation functions with value and gradient methods.

gradient(X: numpy.ndarray) → numpy.ndarray

Compute the gradient of the activation function at X.
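
A hedged sketch of plugging a custom activation into this interface; the value-side method name (value) is an assumption based on the summary above, since only gradient is documented here:

    import numpy as np
    from nerva_numpy.activation_functions import ActivationFunction

    class SoftplusActivation(ActivationFunction):
        # Illustrative subclass, not part of the library.

        def value(self, X: np.ndarray) -> np.ndarray:
            # softplus(X) = log(1 + exp(X)); the method name 'value' is an assumption.
            return np.log1p(np.exp(X))

        def gradient(self, X: np.ndarray) -> np.ndarray:
            # d/dX softplus(X) = sigmoid(X)
            return 1.0 / (1.0 + np.exp(-X))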

class nerva_numpy.activation_functions.ReLUActivation

Bases: ActivationFunction

ReLU activation function: max(0, x).

gradient(X: numpy.ndarray) → numpy.ndarray

Compute gradient of ReLU.
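
A short usage sketch of the class-based API (assuming nerva_numpy is installed; only the documented gradient method is exercised):

    import numpy as np
    from nerva_numpy.activation_functions import ReLUActivation

    act = ReLUActivation()
    X = np.array([[-1.0, 0.5], [2.0, -3.0]])
    print(act.gradient(X))  # per the docs: 1 where X > 0, 0 elsewhere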

class nerva_numpy.activation_functions.LeakyReLUActivation(alpha)

Bases: ActivationFunction

Leaky ReLU activation: max(x, alpha * x).

gradient(X: numpy.ndarray) → numpy.ndarray

Compute gradient of leaky ReLU.

class nerva_numpy.activation_functions.AllReLUActivation(alpha)

Bases: ActivationFunction

AllReLU activation (alternative parameterization of leaky ReLU).

gradient(X: numpy.ndarray) → numpy.ndarray

Compute gradient of AllReLU.

class nerva_numpy.activation_functions.HyperbolicTangentActivation

Bases: ActivationFunction

Hyperbolic tangent activation function.

gradient(X: numpy.ndarray) → numpy.ndarray

Compute gradient of hyperbolic tangent.

class nerva_numpy.activation_functions.SigmoidActivation

Bases: ActivationFunction

Sigmoid activation function: 1 / (1 + exp(-x)).

gradient(X: numpy.ndarray) → numpy.ndarray

Compute gradient of sigmoid.

class nerva_numpy.activation_functions.SReLUActivation(al=0.0, tl=0.0, ar=0.0, tr=1.0)

Bases: ActivationFunction

Smooth rectified linear activation with learnable parameters.

gradient(X: numpy.ndarray) → numpy.ndarray

Compute gradient of SReLU with current parameters.

nerva_numpy.activation_functions.parse_activation(text: str) → ActivationFunction

Parse a textual activation specification into an ActivationFunction.

Examples include “ReLU”, “Sigmoid”, “HyperbolicTangent”, “AllReLU(alpha=0.1)”, “LeakyReLU(alpha=0.1)”, and “SReLU(al=0, tl=0, ar=0, tr=1)”.
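
A hedged usage sketch of the parser, based only on the specification strings listed above:

    import numpy as np
    from nerva_numpy.activation_functions import parse_activation

    act = parse_activation("LeakyReLU(alpha=0.1)")  # presumably a LeakyReLUActivation instance
    G = act.gradient(np.array([-1.0, 2.0]))         # gradient is part of the documented interface

    srelu = parse_activation("SReLU(al=0, tl=0, ar=0, tr=1)")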