labml_nn

🧠 Minimal implementations of neural network architectures and layers in PyTorch with side-by-side notes



This is a collection of simple PyTorch implementations of neural networks and related algorithms. The implementations are documented with explanations, and the website renders them as side-by-side formatted notes. We believe these notes will help you understand the algorithms better.

We are actively maintaining this repo and adding new implementations.

Modules

The Transformers module contains implementations of multi-headed attention and relative multi-headed attention (see the sketch after this list for the basic idea).

✨ LSTM

✨ Sketch RNN

✨ Optimizers
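
To give a flavour of what the notes cover, here is a bare-bones multi-headed attention layer in plain PyTorch. This is only an illustrative sketch, not the annotated implementation from this repo (which also covers masking, dropout, and relative attention); shapes and details are simplified, and the class name is made up for this example.

```python
import torch
import torch.nn as nn


class MiniMultiHeadAttention(nn.Module):
    """A bare-bones multi-headed self-attention layer (illustrative only)."""

    def __init__(self, d_model: int, heads: int):
        super().__init__()
        assert d_model % heads == 0, "d_model must be divisible by heads"
        self.heads = heads
        self.d_k = d_model // heads
        # Separate linear projections for queries, keys and values
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [batch_size, seq_len, d_model]
        batch_size, seq_len, _ = x.shape
        # Project and split into heads: [batch_size, heads, seq_len, d_k]
        q = self.q_proj(x).view(batch_size, seq_len, self.heads, self.d_k).transpose(1, 2)
        k = self.k_proj(x).view(batch_size, seq_len, self.heads, self.d_k).transpose(1, 2)
        v = self.v_proj(x).view(batch_size, seq_len, self.heads, self.d_k).transpose(1, 2)
        # Scaled dot-product attention scores: [batch_size, heads, seq_len, seq_len]
        scores = q @ k.transpose(-2, -1) / (self.d_k ** 0.5)
        attn = scores.softmax(dim=-1)
        # Weighted sum of values, then merge the heads back to d_model
        out = (attn @ v).transpose(1, 2).reshape(batch_size, seq_len, -1)
        return self.out_proj(out)


# Quick shape check
x = torch.randn(2, 10, 64)                      # [batch, seq_len, d_model]
print(MiniMultiHeadAttention(64, 8)(x).shape)   # torch.Size([2, 10, 64])
```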

Installation

pip install labml_nn
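
After installing, you can do a quick import check. The import path, constructor arguments, and tensor shapes below are assumptions based on the side-by-side notes; consult the website for the exact, up-to-date API.

```python
import torch
# Assumed import path for the annotated multi-headed attention layer
from labml_nn.transformers.mha import MultiHeadAttention

# Assumed constructor arguments: number of heads and model dimension
mha = MultiHeadAttention(heads=8, d_model=512)

# The notes describe inputs of shape [seq_len, batch_size, d_model]
x = torch.randn(20, 4, 512)
out = mha(query=x, key=x, value=x)
print(out.shape)  # expected: torch.Size([20, 4, 512])
```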

Citing LabML

If you use LabML for academic research, please cite the library using the following BibTeX entry.

@misc{labml,
 author = {Varuna Jayasiri and Nipun Wijerathne},
 title = {LabML: A library to organize machine learning experiments},
 year = {2020},
 url = {https://lab-ml.com/},
}