Like nanoGPT, but tiny. It is also inspired by tinygrad, PyTorch and MLX.
The main objective of this project is to be as didactic as possible, avoiding optimizations that make the code difficult to understand.
The goal is to understand how to train and run a model like GPT-3 using as few libraries as possible, programming everything from scratch, including the library used to train and run the model.
The current recommended way to install TinyGPT is from source.
$ git clone https://github.com/isaacperez/tinyGPT.git
$ cd tinygpt
$ python -m pip install -e .
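After the editable install, you can check that Python can locate the package. This is a minimal sanity check, assuming the installed module is named `tinygpt` (matching the repository name):

```python
# Verify that the editable install made the package importable.
# "tinygpt" is assumed to be the module name exposed by the install.
import importlib.util

spec = importlib.util.find_spec("tinygpt")
if spec is not None:
    print("tinygpt is importable from:", spec.origin)
else:
    print("tinygpt not found; re-run the install step above")
```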
Don't forget the `.` at the end!
[TO DO]
Documentation along with a quick start guide can be found in the docs/ directory.
Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
Please make sure to update tests as appropriate.
To run the test suite, you need to install pytest:
$ python -m pip install pytest
and TinyGPT, then run:
$ pytest
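pytest discovers tests automatically: any function whose name starts with `test_`, in a file named `test_*.py`, is collected and run. A minimal example of the kind of test you might add (the function and file names here are illustrative, not taken from the TinyGPT test suite):

```python
# test_example.py -- pytest collects this because the filename matches
# test_*.py and the function name starts with "test_".
def test_addition():
    # A plain assert is all pytest needs; it reports rich diffs on failure.
    assert 1 + 1 == 2
```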