Issues
- Is `config.rank <= 0` ever true? (#19, opened by def-roth, 0 comments; see the sketch after this list)
- Updated candle-core, candle-nn [0.5.0] release breaks installation of candle-lora and candle-lora-macro dependencies (#15, opened by Andycharalambous, 5 comments)
- error[E0277]: expected a `Fn<(&candle_core::Tensor,)>` closure, found `BatchNorm` (#8, opened by EricLBuehler, 6 comments)
- Model Merging (#6, opened by okpatil4u, 2 comments)
- any example for llama_lora training (#7, opened by arthasyou, 1 comment)
- QA-LoRA Implementation and Review (#3, opened by okpatil4u, 0 comments)
- Add more LoRA transformers (#4, opened by EricLBuehler, 5 comments)
- Examples for Llama model architecture (#2, opened by okpatil4u, 4 comments)