nerva_numpy documentation

A tiny, educational set of neural network components built on NumPy.

Install and build

# from repository root
python -m pip install -U sphinx sphinx-rtd-theme
# build HTML docs into docs_sphinx/_build/html
sphinx-build -b html docs_sphinx docs_sphinx/_build/html

API reference

nerva_numpy

nerva_numpy: Minimal neural network components built on top of NumPy.

nerva_numpy.activation_functions

Activation functions and utilities used by the MLP implementation.
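As an illustration of what such a module typically provides (the function names here are hypothetical, not necessarily the library's own API), here is a ReLU activation and its derivative in plain NumPy:

```python
import numpy as np

def relu(X: np.ndarray) -> np.ndarray:
    """Element-wise rectified linear unit: max(x, 0)."""
    return np.maximum(X, 0.0)

def relu_derivative(X: np.ndarray) -> np.ndarray:
    """Derivative of ReLU: 1 where x > 0, else 0 (used in backpropagation)."""
    return (X > 0.0).astype(X.dtype)
```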

nerva_numpy.datasets

In-memory data loader helpers and one-hot conversions.
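A one-hot conversion of the kind this module describes can be sketched as follows (the function name is illustrative, not the module's actual API):

```python
import numpy as np

def to_one_hot(labels: np.ndarray, num_classes: int) -> np.ndarray:
    """Convert a vector of integer class labels to a one-hot matrix."""
    one_hot = np.zeros((labels.shape[0], num_classes))
    one_hot[np.arange(labels.shape[0]), labels] = 1.0
    return one_hot
```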

nerva_numpy.layers

Neural network layers used by the MultilayerPerceptron class.

nerva_numpy.learning_rate

Learning-rate schedulers.
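A common scheduler of this kind is time-based decay; the class below is a generic sketch (name and constructor are assumptions, not the library's API):

```python
class TimeBasedScheduler:
    """Time-based decay: eta_t = eta_0 / (1 + decay * t)."""

    def __init__(self, lr: float, decay: float):
        self.lr = lr
        self.decay = decay

    def __call__(self, epoch: int) -> float:
        # Learning rate shrinks smoothly as the epoch counter grows.
        return self.lr / (1.0 + self.decay * epoch)
```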

nerva_numpy.loss_functions

Analytic loss functions and their gradients used during training.
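"Analytic" here means the gradients are coded by hand rather than obtained by automatic differentiation. A standard example is softmax cross-entropy, whose gradient with respect to the pre-activations has the closed form Y - T (this sketch assumes one-hot targets; the names are illustrative):

```python
import numpy as np

def softmax(Z: np.ndarray) -> np.ndarray:
    """Row-wise softmax; the row max is subtracted for numerical stability."""
    Z = Z - Z.max(axis=1, keepdims=True)
    E = np.exp(Z)
    return E / E.sum(axis=1, keepdims=True)

def softmax_cross_entropy(Z: np.ndarray, T: np.ndarray):
    """Mean cross-entropy loss and its analytic gradient w.r.t. Z.

    T is a one-hot target matrix; the gradient is (Y - T) / batch_size.
    """
    Y = softmax(Z)
    loss = -np.sum(T * np.log(Y)) / Z.shape[0]
    grad = (Y - T) / Z.shape[0]
    return loss, grad
```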

nerva_numpy.matrix_operations

Matrix operations built on top of NumPy to support the math in the library.

nerva_numpy.multilayer_perceptron

A simple multilayer perceptron (MLP) class.
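The forward pass of such an MLP can be sketched in a few lines of NumPy; this is a generic two-layer example with a ReLU hidden activation, not the class's actual implementation:

```python
import numpy as np

def mlp_forward(X, W1, b1, W2, b2):
    """Forward pass of a two-layer MLP: linear -> ReLU -> linear."""
    H = np.maximum(X @ W1 + b1, 0.0)  # hidden layer activations
    return H @ W2 + b2                # linear output layer (logits)
```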

nerva_numpy.optimizers

Optimizers used to adjust the model's parameters based on the gradients.
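For illustration, here is classical SGD with momentum in plain NumPy (the class name and `update` signature are assumptions, not the library's interface):

```python
import numpy as np

class MomentumSGD:
    """Classical momentum update: v <- mu*v - lr*g, then w <- w + v."""

    def __init__(self, lr: float, mu: float = 0.9):
        self.lr = lr
        self.mu = mu
        self.v = None  # velocity, lazily initialized to the parameter shape

    def update(self, w: np.ndarray, g: np.ndarray) -> np.ndarray:
        if self.v is None:
            self.v = np.zeros_like(w)
        self.v = self.mu * self.v - self.lr * g
        return w + self.v
```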

nerva_numpy.softmax_functions

Softmax and log-softmax functions together with stable variants.
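A stable variant of log-softmax uses the log-sum-exp trick: subtracting the row maximum before exponentiating prevents overflow. A minimal sketch (illustrative, not the module's exact function):

```python
import numpy as np

def log_softmax(Z: np.ndarray) -> np.ndarray:
    """Numerically stable row-wise log-softmax via the log-sum-exp trick."""
    Z = Z - Z.max(axis=1, keepdims=True)  # shift so the largest entry is 0
    return Z - np.log(np.exp(Z).sum(axis=1, keepdims=True))
```

Without the shift, `np.exp(1000.0)` would overflow to infinity; with it, the same input is handled exactly.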

nerva_numpy.training

Training helpers for the MLP, including a basic SGD loop and CLI glue.
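The heart of a basic SGD loop is iterating over shuffled minibatches each epoch; the helper below sketches that piece under assumed names (it is not the module's actual API):

```python
import numpy as np

def minibatches(X: np.ndarray, T: np.ndarray, batch_size: int, rng):
    """Yield (inputs, targets) minibatches in a fresh random order,
    as one epoch of a basic SGD loop would consume them."""
    idx = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], T[batch]
```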

nerva_numpy.utilities

Miscellaneous utilities (formatting, timing, parsing, I/O).

nerva_numpy.weight_initializers

Weight and bias initialization helpers for linear layers.
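A widely used initializer for linear layers is Glorot/Xavier uniform, which draws from U(-a, a) with a = sqrt(6 / (fan_in + fan_out)); this sketch uses an assumed function name, not necessarily the module's own:

```python
import numpy as np

def xavier_uniform(rows: int, cols: int, rng=None) -> np.ndarray:
    """Glorot/Xavier uniform weight initialization for a linear layer."""
    rng = rng if rng is not None else np.random.default_rng()
    a = np.sqrt(6.0 / (rows + cols))  # bound depends on fan_in + fan_out
    return rng.uniform(-a, a, size=(rows, cols))
```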