HomebrewNLP-MTF

HomebrewNLP in Mesh-TensorFlow flavour for distributed TPU training

Primary Language: Python | License: BSD 2-Clause "Simplified" (BSD-2-Clause)

Please use Olmax instead.

Original Readme

OBST

Copyright (c) 2020-2022 Yannic Kilcher (yk), Lucas Nestler (clashluke), Shawn Presser (shawwn), Jan (xmaster96)

Quickstart

First, create your VM through Google Cloud Shell with ctpu up --vm-only. This way it has all the necessary permissions to connect to your buckets and TPUs.
Next, install the requirements on your VM using git clone https://github.com/tensorfork/obst && cd obst && python3 -m pip install -r requirements.txt.
Finally, start a TPU to kick off a training run using python3 main.py --model configs/big_ctx.json --tpu ${YOUR_TPU_NAME}.
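The three steps above are collected in the shell sketch below. The first command runs in Google Cloud Shell; the rest run on the newly created VM. ${YOUR_TPU_NAME} is a placeholder for your TPU's name, and all other settings (project, zone, TPU size) are assumed to come from your existing gcloud/ctpu defaults.

```sh
# Step 1 (in Google Cloud Shell): create a VM only, so it inherits the
# permissions needed to reach your Cloud Storage buckets and TPUs.
ctpu up --vm-only

# Step 2 (on the new VM): fetch the code and install the Python requirements.
git clone https://github.com/tensorfork/obst && cd obst
python3 -m pip install -r requirements.txt

# Step 3: start a TPU and kick off a training run with the big-context config.
# Replace ${YOUR_TPU_NAME} with the name of your TPU.
python3 main.py --model configs/big_ctx.json --tpu ${YOUR_TPU_NAME}
```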

Acknowledgements

We also want to explicitly thank