[Minor] Possible typos in weight initialization
awgu opened this issue · 0 comments
awgu commented
The recent commit a0a92e0 flipped A and B in the comment for the LoRA Linear module:
Lines 119 to 125 in a0a92e0
The LoRA Embedding module similarly has the initialization flipped (not sure if this is intentional):
Lines 55 to 60 in a0a92e0
Following the paper, I would expect nn.init.normal_(self.lora_A) and nn.init.zeros_(self.lora_B).
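For reference, here is a minimal sketch of the initialization I have in mind (the class name, shapes, and scaling here are my own illustration, not the repo's actual code): A gets a random init while B is zeroed, so the low-rank update B @ A contributes nothing at the start of training.

```python
import math

import torch
import torch.nn as nn


class LoRALinearSketch(nn.Module):
    """Illustrative LoRA linear layer with paper-style initialization:
    A ~ random, B = 0, so the update B @ A is zero before training."""

    def __init__(self, in_features: int, out_features: int, r: int, alpha: float = 1.0):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.lora_A = nn.Parameter(torch.empty(r, in_features))
        self.lora_B = nn.Parameter(torch.empty(out_features, r))
        self.scaling = alpha / r
        self.reset_parameters()

    def reset_parameters(self):
        # Base weight init (placeholder; the frozen pretrained weight would be loaded in practice).
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))
        # Following the paper: random init for A, zeros for B.
        nn.init.normal_(self.lora_A)
        nn.init.zeros_(self.lora_B)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen projection plus the scaled low-rank update (B @ A).
        return x @ self.weight.T + (x @ self.lora_A.T @ self.lora_B.T) * self.scaling
```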
I can open a PR to fix these if you want (though I cannot seem to save the file without removing trailing whitespace, for some reason).