LoRA: Low-Rank Adaptation of Large Language Models
Note:
- Added LoRD (Low-Rank Decomposition of Monolingual Code LLMs for One-Shot Compression) in lora_decompose()
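The core of LoRD-style one-shot compression is replacing a weight matrix with a truncated-SVD factorization. A minimal sketch of that idea (this `low_rank_decompose` helper is illustrative and not the repo's actual `lora_decompose()`):

```python
import numpy as np

def low_rank_decompose(W: np.ndarray, rank: int):
    """Approximate W (d_out x d_in) as B @ A with B (d_out x rank) and
    A (rank x d_in) via truncated SVD, the standard low-rank factorization
    underlying LoRD-style one-shot compression. Illustrative sketch only."""
    U, S, Vt = np.linalg.svd(W, full_matrices=False)
    B = U[:, :rank] * S[:rank]  # fold singular values into the left factor
    A = Vt[:rank, :]
    return B, A

W = np.random.randn(64, 128)
B, A = low_rank_decompose(W, rank=8)
# B @ A is the best rank-8 approximation of W in the Frobenius norm,
# storing (64 + 128) * 8 values instead of 64 * 128
```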
TODO:
- MoLoRA (Mixture of Experts for LoRA)
- VeRA (Vector-based Random Matrix Adaptation): the implementation works well even without the paper's suggested initialization strategies, so it may need more checking
Credit: AI chatbot, @Ayush Kaushal, @ontocord, @cloneofsimo
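For reference, the LoRA update itself can be sketched as a frozen weight plus a scaled low-rank correction, y = x W^T + (alpha / r) x A^T B^T. The class and parameter names below are illustrative, not this repo's API:

```python
import numpy as np

class LoRALinear:
    """Minimal LoRA linear-layer sketch. Only A and B are trainable;
    the pretrained weight W stays frozen. Illustrative names only."""

    def __init__(self, d_in, d_out, r=8, alpha=16, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((d_out, d_in))      # frozen pretrained weight
        self.A = rng.standard_normal((r, d_in)) * 0.01   # trainable down-projection
        self.B = np.zeros((d_out, r))                    # trainable up-projection, zero init
        self.scale = alpha / r  # zero B init => the adapter starts as a no-op

    def forward(self, x):
        return x @ self.W.T + self.scale * (x @ self.A.T) @ self.B.T

    def merge(self):
        """Fold the adapter into W for inference, removing the extra matmuls."""
        return self.W + self.scale * self.B @ self.A
```

Because B is initialized to zero, the layer's output at the start of training matches the frozen model exactly, and `merge()` recovers a single dense weight after fine-tuning.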