gpt-j-fine-tuning-example

Fine-tuning the 6-billion-parameter GPT-J (and other models) with LoRA and 8-bit compression
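The core idea behind LoRA is that the large frozen weight matrix W is left untouched (and can therefore be kept in compressed 8-bit form), while a small low-rank update (alpha/r) * B @ A is added on top and only A and B are trained. The sketch below is a hypothetical, pure-Python illustration of just that math, not the repo's actual notebook code; the dimensions, variable names, and values are made up for the example.

```python
# Minimal sketch of the LoRA idea (hypothetical toy code):
# the frozen weight W is augmented with a low-rank update (alpha/r) * B @ A,
# so only A and B (r*(d_in + d_out) numbers) are trained instead of all of W.

def matmul(m, v):
    """Multiply matrix m (list of rows) by vector v."""
    return [sum(w * x for w, x in zip(row, v)) for row in m]

def lora_forward(W, A, B, x, alpha=16, r=2):
    """y = W x + (alpha/r) * B (A x); W stays frozen, only A and B train."""
    base = matmul(W, x)
    update = matmul(B, matmul(A, x))
    scale = alpha / r
    return [b + scale * u for b, u in zip(base, update)]

# Toy dimensions: d_out = 2, d_in = 3, rank r = 2.
W = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0]]
A = [[0.1, 0.2, 0.3],   # r x d_in, small nonzero init
     [0.0, 0.1, 0.0]]
B = [[0.0, 0.0],        # d_out x r, zero init: the adapter starts as a no-op
     [0.0, 0.0]]

x = [1.0, 2.0, 3.0]
print(lora_forward(W, A, B, x))  # equals matmul(W, x) since B is zero at init
```

Because B is initialized to zero, the adapted model starts out identical to the base model; training then moves only A and B, which is what makes combining LoRA with an 8-bit frozen base model practical on a single GPU.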

Primary language: Jupyter Notebook