Lightning-AI/lit-llama

This codebase has so many errors it is completely useless and unusable

Abecid opened this issue · 2 comments

Abecid commented

precision = "bf16-true"

is unsupported, yet it is used in the LoRA fine-tuning code.

fabric.init_module

raises a "does not exist" (AttributeError) error.

with fabric.device

results in an error in the full fine-tuning script.

Overall, a very poor experience and poor documentation. Garbage.

rasbt commented

I don't know about the with fabric.device issue, but let me address the other two

  1. precision = "bf16-true"

  2. fabric.init_module

with more explicit warnings and suggestions via a PR shortly.

rasbt commented

@Abecid What error are you getting with bfloat16? I think it's only supported on Ampere and newer GPUs, but it appears that it now also works on older T4s and on CPU. I just tested it. Maybe it's a PyTorch version thing.

If you have time and don't mind spending a few more minutes, could you let me know the error message you are getting and your PyTorch version so I can look into it further? I could then add a more explicit warning to save future users the hassle.
[Screenshot, 2023-08-08: bfloat16 test output]
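A hedged sketch of the kind of explicit warning discussed here. In real code the bf16_ok flag would come from torch.cuda.is_bf16_supported() (an actual PyTorch API); it is hard-coded below to keep the sketch self-contained, and the "32-true" fallback is an illustrative choice, not lit-llama's actual behavior:

```python
def choose_precision(bf16_ok: bool) -> str:
    """Pick a Fabric precision string, falling back when bfloat16 is missing.

    "bf16-true" needs hardware bfloat16 support (Ampere+ GPUs, recent
    CPUs/PyTorch builds); otherwise fall back to plain fp32.
    """
    if not bf16_ok:
        print("Warning: bfloat16 not supported on this hardware; using 32-true.")
        return "32-true"
    return "bf16-true"


# In practice: bf16_ok = torch.cuda.is_bf16_supported()
print(choose_precision(True))   # bf16-true
print(choose_precision(False))  # 32-true (after the warning)
```

Emitting the warning once, at startup, tells the user why training is slower or uses more memory instead of crashing with an opaque precision error.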