nerva_torch
nerva_torch: Minimal neural network components built on top of PyTorch.
This package provides small, educational implementations of layers, activation functions, loss functions, optimizers, learning-rate schedulers, softmax utilities, and simple training helpers. It is designed for readability and experimentation rather than performance.
Modules
- Activation functions and utilities used by the MLP implementation.
- In-memory data loader helpers and one-hot conversions (see the one-hot sketch below).
- Neural network layers used by the MultilayerPerceptron class (see the linear-layer sketch below).
- Learning-rate schedulers (see the scheduler sketch below).
- Analytic loss functions and their gradients used during training (see the loss-gradient sketch below).
- Thin wrappers around PyTorch loss modules for comparison and testing.
- Matrix operations built on top of torch to support the math in the library.
- A simple multilayer perceptron (MLP) class.
- Optimizers that adjust the model's parameters based on the gradients (see the momentum sketch below).
- Softmax and log-softmax functions together with numerically stable variants (see the stable-softmax sketch below).
- Training helpers for the MLP, including a basic SGD loop and CLI glue (see the training-loop sketch below).
- Miscellaneous utilities (formatting, timing, parsing, I/O).
- Weight and bias initialization helpers for linear layers (see the initialization sketch below).
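
The one-hot conversion mentioned for the data loader helpers is the standard encoding of class indices as indicator vectors. A minimal sketch in plain PyTorch; the function name `to_one_hot` is illustrative, not necessarily the package's own API:

```python
import torch

def to_one_hot(targets: torch.Tensor, num_classes: int) -> torch.Tensor:
    """Convert a 1-D tensor of class indices (N,) to a float matrix (N, num_classes)."""
    return torch.nn.functional.one_hot(targets, num_classes).float()

# Example: classes 2, 0, 1 out of 3 classes
T = to_one_hot(torch.tensor([2, 0, 1]), num_classes=3)
# tensor([[0., 0., 1.],
#         [1., 0., 0.],
#         [0., 1., 0.]])
```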
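A linear layer's forward map and the backpropagation rules behind it fit in a few lines. This is a sketch of the standard textbook equations, assuming row-major batches X of shape (N, D); the names are illustrative, not the package's signatures:

```python
import torch

def linear_forward(X: torch.Tensor, W: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    # Y = X W^T + b, with X: (N, D), W: (K, D), b: (K,)
    return X @ W.T + b

def linear_backward(X: torch.Tensor, W: torch.Tensor, DY: torch.Tensor):
    # Standard backpropagation rules for a linear layer:
    #   DX = DY W          (gradient w.r.t. the input)
    #   DW = DY^T X        (gradient w.r.t. the weights)
    #   Db = sum of DY's rows (gradient w.r.t. the bias)
    DX = DY @ W
    DW = DY.T @ X
    Db = DY.sum(dim=0)
    return DX, DW, Db
```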
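Learning-rate schedulers of this kind are typically just functions from the epoch number to a rate. A sketch of one common choice, time-based decay; whether the package ships this exact schedule is an assumption, and the names are hypothetical:

```python
def time_based_decay(lr0: float, decay: float):
    # lr(t) = lr0 / (1 + decay * t), a standard time-based schedule
    def schedule(epoch: int) -> float:
        return lr0 / (1.0 + decay * epoch)
    return schedule

lr_at = time_based_decay(lr0=0.1, decay=0.01)
lr_at(0)   # 0.1
lr_at(10)  # ~0.0909
```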
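"Analytic loss functions and their gradients" refers to closed-form derivatives rather than autograd. For softmax cross-entropy with one-hot targets T, the gradient with respect to the pre-softmax outputs Y has the well-known form softmax(Y) - T. A plain-PyTorch sketch with illustrative names:

```python
import torch

def softmax_cross_entropy(Y: torch.Tensor, T: torch.Tensor) -> torch.Tensor:
    # Y: pre-softmax outputs (N, K); T: one-hot targets (N, K)
    log_p = torch.log_softmax(Y, dim=1)
    return -(T * log_p).sum() / Y.shape[0]

def softmax_cross_entropy_gradient(Y: torch.Tensor, T: torch.Tensor) -> torch.Tensor:
    # Closed-form derivative: d(loss)/dY = (softmax(Y) - T) / N
    return (torch.softmax(Y, dim=1) - T) / Y.shape[0]
```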
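Optimizers in this style keep per-parameter state and apply an update rule after each backward pass. A sketch of SGD with classical momentum, assuming gradients have already been computed; the class is hypothetical, not the package's API:

```python
import torch

class MomentumOptimizer:
    """SGD with classical momentum: v <- mu * v - lr * g; p <- p + v."""

    def __init__(self, params, lr: float = 0.1, mu: float = 0.9):
        self.params = list(params)
        self.lr, self.mu = lr, mu
        self.velocities = [torch.zeros_like(p) for p in self.params]

    @torch.no_grad()
    def step(self):
        for p, v in zip(self.params, self.velocities):
            v.mul_(self.mu).sub_(self.lr * p.grad)  # v = mu*v - lr*g
            p.add_(v)                               # p = p + v
```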
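The numerically stable softmax variants shift the inputs by their row maximum before exponentiating, which prevents overflow without changing the result. A sketch of both functions:

```python
import torch

def stable_softmax(X: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Subtracting the row maximum cancels in the normalization,
    # but keeps exp() from overflowing for large inputs.
    Z = X - X.amax(dim=dim, keepdim=True)
    E = Z.exp()
    return E / E.sum(dim=dim, keepdim=True)

def stable_log_softmax(X: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # log softmax(x) = (x - m) - log(sum(exp(x - m))) with m = max(x)
    Z = X - X.amax(dim=dim, keepdim=True)
    return Z - Z.exp().sum(dim=dim, keepdim=True).log()
```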
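A basic SGD loop of the kind the training helpers implement fits in a dozen lines. A plain-PyTorch sketch, assuming an `nn.Module`-like model and a callable loss function; the names are illustrative:

```python
import torch

def train_sgd(model, loss_fn, loader, lr: float = 0.1, epochs: int = 5):
    for epoch in range(epochs):
        total = 0.0
        for X, T in loader:                 # mini-batches of inputs and targets
            loss = loss_fn(model(X), T)     # forward pass
            model.zero_grad()
            loss.backward()                 # backward pass
            with torch.no_grad():
                for p in model.parameters():
                    p -= lr * p.grad        # plain SGD update
            total += loss.item()
        print(f"epoch {epoch}: loss {total / len(loader):.4f}")
```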
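Weight initialization helpers for linear layers typically implement schemes such as Xavier/Glorot. A sketch of Xavier-uniform initialization; the helper name is hypothetical:

```python
import math
import torch

def xavier_uniform_(W: torch.Tensor) -> None:
    # Xavier/Glorot uniform: sample from U(-a, a) with
    # a = sqrt(6 / (fan_in + fan_out)) for a weight matrix (fan_out, fan_in).
    fan_out, fan_in = W.shape
    a = math.sqrt(6.0 / (fan_in + fan_out))
    with torch.no_grad():
        W.uniform_(-a, a)
```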