This codebase has so many errors it is completely useless and unusable
Abecid opened this issue · 2 comments
- `precision = "bf16-true"` is unsupported, yet it is what the LoRA fine-tuning code uses.
- `fabric.init_module` raises a "does not exist" error.
- `with fabric.device` results in an error in the full fine-tuning script.
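For reference, a minimal sketch of the three calls in question, assuming Lightning Fabric 2.x (the tiny `nn.Linear` model and tensor are stand-ins for illustration, not the repo's actual fine-tuning code):

```python
# Minimal sketch of the three reported failure points, assuming Lightning Fabric 2.x.
import torch
import torch.nn as nn
from lightning.fabric import Fabric

fabric = Fabric(precision="bf16-true")  # reported as unsupported
fabric.launch()

with fabric.init_module():  # reported to raise a "does not exist" error
    model = nn.Linear(16, 16)

with fabric.device:  # reported to error in the full fine-tuning script
    x = torch.randn(1, 16)
```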
Overall, a very poor experience and poor documentation. Garbage.
I don't know about the `with fabric.device` one, but let me address the other two:

- `precision = "bf16-true"`
- `fabric.init_module`

I'll add more explicit warnings and suggestions for both via a PR shortly.
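As a rough sketch of the kind of guard such a PR could add (the version cutoffs and messages below are my assumptions, not the actual patch):

```python
# Sketch of an explicit up-front check for the fine-tuning scripts; the
# version cutoffs are assumptions, check the Lightning/PyTorch changelogs.
import torch
from packaging import version
from lightning.fabric import Fabric

if not hasattr(Fabric, "init_module"):
    raise RuntimeError(
        "Your Lightning install has no Fabric.init_module; "
        "upgrade with `pip install -U lightning`."
    )

if version.parse(torch.__version__) < version.parse("2.0.0"):
    raise RuntimeError(
        "`with fabric.device:` needs torch.device to act as a context "
        "manager, which was added in PyTorch 2.0; please upgrade PyTorch."
    )
```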
@Abecid What error are you getting with bfloat16? I think it's only supported on Ampere and newer, but it appears that it now also works on older T4s and on CPU; I just tested it. Maybe it's a PyTorch version thing.
If you have time and don't mind spending a few more minutes, could you let me know the exact error message you are getting and your PyTorch version so I can look into it further? I could then add a more explicit warning to save the hassle for future users.
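If it helps, a quick self-contained check along these lines would capture everything needed (plain PyTorch calls only, nothing specific to this repo):

```python
# Collect bf16 support info and versions for the bug report.
import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
    # Checks both the GPU's compute capability and the CUDA build.
    print("GPU bf16 supported:", torch.cuda.is_bf16_supported())
# bf16 tensors generally work on CPU regardless of GPU support.
x = torch.randn(2, 2, dtype=torch.bfloat16)
print("CPU bf16 tensor created:", x.dtype)
```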