- 🤗 Try the pretrained model out here
This repository is forked from alpaca-lora and introduces a method for training additional modules, such as the token embedding (`embed_tokens`) and LM head (`lm_head`), with LoRA.
With a trained embedding and head, you can get better multilingual performance.
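Below is a minimal sketch of how this idea can be expressed with the Hugging Face PEFT library: the usual attention projections are targeted as in alpaca-lora, and the LLaMA embedding and LM head modules are added to `target_modules` as well. The base model ID, module names, and hyperparameters here are illustrative and may differ from this repo's actual training config.

```python
# Minimal sketch (not necessarily this repo's exact config): apply LoRA to the
# embedding and LM head in addition to the attention projections, using PEFT.
from transformers import LlamaForCausalLM
from peft import LoraConfig, get_peft_model

base = LlamaForCausalLM.from_pretrained("decapoda-research/llama-7b-hf")  # example base model

lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    # Attention projections as in alpaca-lora, plus embed/head (LLaMA module names).
    target_modules=["q_proj", "v_proj", "embed_tokens", "lm_head"],
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # embed/head adapters add to the trainable count
```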
We trained the pretrained model on the cleaned Alpaca dataset from alpaca-lora together with the full Guanaco dataset.
Guanaco dataset: link
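A rough sketch of how the two datasets could be combined with the `datasets` library is shown below; the file names are placeholders, not the actual paths used in this repo, and both files are assumed to share the Alpaca-style `instruction` / `input` / `output` schema.

```python
# Rough sketch of mixing the two instruction datasets with Hugging Face `datasets`.
# File names are placeholders: point them at the cleaned Alpaca JSON from
# alpaca-lora and your downloaded Guanaco dataset export.
from datasets import load_dataset, concatenate_datasets

alpaca = load_dataset("json", data_files="alpaca_data_cleaned.json")["train"]
guanaco = load_dataset("json", data_files="guanaco_dataset.json")["train"]

# Assumes both files share the same Alpaca-style schema (instruction / input / output).
train_data = concatenate_datasets([alpaca, guanaco]).shuffle(seed=42)
print(len(train_data))
```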
Basically the same as alpaca-lora; a sketch of the general training pattern follows.
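The sketch below mirrors the usual `transformers.Trainer` pattern used by alpaca-lora rather than this repo's exact finetune script; the hyperparameters are examples only. `model` and `train_data` refer to the earlier sketches, and `train_data` is assumed to already be tokenized into `input_ids` / `labels`.

```python
# Illustrative training step only; not this repo's exact script or defaults.
import transformers

tokenizer = transformers.LlamaTokenizer.from_pretrained("decapoda-research/llama-7b-hf")
tokenizer.pad_token_id = 0  # LLaMA has no pad token; reuse <unk>, as alpaca-lora does

trainer = transformers.Trainer(
    model=model,                # PEFT model from the LoRA sketch above
    train_dataset=train_data,   # mixed alpaca + guanaco data, already tokenized
    args=transformers.TrainingArguments(
        per_device_train_batch_size=4,
        gradient_accumulation_steps=32,
        num_train_epochs=3,
        learning_rate=3e-4,
        fp16=True,
        output_dir="./lora-out",
    ),
    data_collator=transformers.DataCollatorForSeq2Seq(
        tokenizer, padding=True, return_tensors="pt"
    ),
)
trainer.train()
model.save_pretrained("./lora-out")  # saves only the LoRA (incl. embed/head) weights
```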
Some examples of instruction-based QA and instruction-based chat:
Todo...