microsoft/LoRA

Using gradient checkpointing with LoRA

dudskrk opened this issue · 0 comments

When I use gradient checkpointing with LoRA, all gradients come out as `None`, so lora_A and lora_B are never updated.
If I turn gradient checkpointing off, everything works fine.
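For reference, here is a minimal sketch of the symptom using plain `torch.utils.checkpoint` (a hypothetical LoRA-style module, not loralib's actual classes). The likely cause: with reentrant checkpointing, if no *input* tensor to the checkpointed block requires grad (e.g. it comes from frozen embeddings), the checkpoint output carries no `grad_fn`, backward never re-enters the block, and the adapter weights get `grad=None`. Passing `use_reentrant=False` (or making the input require grad) appears to work around it:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

torch.manual_seed(0)

# Frozen base layer plus a trainable low-rank adapter (LoRA-style sketch).
base = nn.Linear(8, 8)
for p in base.parameters():
    p.requires_grad = False
lora_A = nn.Linear(8, 2, bias=False)  # trainable
lora_B = nn.Linear(2, 8, bias=False)  # trainable
head = nn.Linear(8, 1)                # trainable layer outside the checkpoint

def block(x):
    return base(x) + lora_B(lora_A(x))

x = torch.randn(4, 8)  # e.g. output of frozen embeddings: requires_grad=False

# Reentrant checkpointing: no input requires grad, so the checkpoint output
# has no grad_fn; backward stops before the block and LoRA grads stay None
# (PyTorch also emits a UserWarning about this).
out = checkpoint(block, x, use_reentrant=True)
head(out).sum().backward()
grad_reentrant = lora_A.weight.grad        # None

# Workaround: non-reentrant checkpointing still tracks parameter gradients
# even when no input requires grad.
out = checkpoint(block, x, use_reentrant=False)
head(out).sum().backward()
grad_non_reentrant = lora_A.weight.grad    # populated

# Alternative workaround: call x.requires_grad_() before the reentrant
# checkpoint so backward is forced through the block.
print(grad_reentrant is None, grad_non_reentrant is not None)
```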

How can I use gradient checkpointing with LoRA?