LoRA and difference with bitsandbytes
RonanKMcGovern opened this issue · 0 comments
RonanKMcGovern commented
- What changes would I need to make for GPTQ to support LoRA fine-tuning of Llama 2? (See the bitsandbytes sketch below for the kind of setup I have in mind.)
- What's the main difference between GPTQ and bitsandbytes quantization? Is it that GPTQ re-adjusts the remaining weights so the loss surface keeps roughly the same shape?
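
For context, the bitsandbytes path I'm comparing against looks roughly like the sketch below (the usual transformers + peft recipe; the model id and LoRA hyperparameters are just placeholders, not a recommendation). I'm asking what the equivalent would need to look like for a GPTQ-quantized checkpoint.

```python
# Rough sketch of the bitsandbytes + PEFT LoRA setup I have in mind for Llama 2.
# Model id and LoRA hyperparameters are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

model_id = "meta-llama/Llama-2-7b-hf"  # assumed model id

# 4-bit quantization via bitsandbytes
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Standard k-bit training prep, then attach LoRA adapters to the attention projections
model = prepare_model_for_kbit_training(model)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

With bitsandbytes this "just works" because the quantized linear layers stay frozen and the LoRA adapters train in higher precision on top; my question is what GPTQ-specific changes (if any) are needed to do the same thing against a GPTQ-quantized Llama 2.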