Finetuning using LoRA and tinygrad. PEFT, but better.


Tiny Lora

Writing LoRA using the tinygrad library. This is a work in progress. It is similar to the PEFT library from Hugging Face, but built with tinygrad and tinygrad principles.
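
For background, LoRA keeps the pretrained weights frozen and learns a low-rank update on top of them, scaled by alpha / rank. The sketch below shows that idea on a tinygrad nn.Linear layer; LoRALinear, lora_a and lora_b are illustrative names for this README only and are not this repo's API.

from tinygrad.tensor import Tensor
from tinygrad.nn import Linear

class LoRALinear:
    # Wraps a frozen tinygrad Linear layer with trainable low-rank adapters.
    def __init__(self, base: Linear, rank: int = 8, alpha: float = 0.9):
        self.base = base
        self.base.weight.requires_grad = False            # freeze the pretrained weight
        if self.base.bias is not None:
            self.base.bias.requires_grad = False
        out_features, in_features = self.base.weight.shape
        # B @ A has the same shape as the frozen weight; B starts at zero,
        # so the wrapped layer initially behaves exactly like the base layer.
        self.lora_a = Tensor.randn(rank, in_features, requires_grad=True)
        self.lora_b = Tensor.zeros(out_features, rank, requires_grad=True)
        self.scale = alpha / rank

    def __call__(self, x: Tensor) -> Tensor:
        # y = base(x) + scale * (x A^T B^T)
        return self.base(x) + (x @ self.lora_a.transpose() @ self.lora_b.transpose()) * self.scale

After training, the learned update scale * (B @ A) can be merged back into the base weight, so inference carries no extra cost.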

Usage

# import your model as model
from lora import MakeLora
config = {'rank': 8, 'alpha': 0.9}
model = MakeLora(model, config)
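
The snippet above only wraps the model. A sketch of what a fine-tuning step could then look like with tinygrad's built-in optimizers follows; it assumes MakeLora leaves only the adapter tensors trainable (an assumption about this repo's behaviour, not something documented above), and the batch, labels and loss are placeholders rather than anything this repo prescribes.

from tinygrad.tensor import Tensor
from tinygrad.nn.state import get_parameters
from tinygrad.nn.optim import Adam

# Assumption: after MakeLora, only the LoRA adapter tensors still require gradients.
trainable = [p for p in get_parameters(model) if p.requires_grad]
opt = Adam(trainable, lr=1e-4)

Tensor.training = True                       # enable training mode (e.g. for dropout)
x = Tensor.randn(4, 128)                     # placeholder batch; shapes depend on your model
y = Tensor([0, 1, 2, 3])                     # placeholder integer labels
loss = model(x).sparse_categorical_crossentropy(y)
opt.zero_grad()
loss.backward()
opt.step()

Only the adapter tensors are passed to the optimizer, which is what keeps LoRA fine-tuning cheap: the frozen base weights never receive gradients or optimizer state.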

TODO

  • LoRA for Linear layers (example in testing_lora.py)
  • LoRA for Tensors (example in testing_lora.py)
  • LoRA for Transformers
  • LoRA for CNNs
  • Fine-tune Mistral 7B using tinygrad.