
Guanaco-lora: LoRA for training a Multilingual Instruction-following LM based on LLaMA

  • 🤗 Try the pretrained model out here

This repository is forked from alpaca-lora and introduces a method to train additional modules, such as the embedding and LM head, alongside LoRA.

With a trained embedding and head, you can achieve better multilingual performance.
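The embed/head training described above can be sketched with Hugging Face `peft`, where `modules_to_save` marks modules to train in full alongside the LoRA adapters. This is a minimal sketch, not the repository's exact configuration; the module names (`q_proj`, `v_proj`, `embed_tokens`, `lm_head`) are assumptions for a LLaMA-style model and may differ for other bases.

```python
from peft import LoraConfig, get_peft_model

# Sketch: LoRA adapters on the attention projections, plus fully
# trainable embedding and LM head. Module names are assumptions
# for a LLaMA-style model.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],          # get LoRA adapters
    modules_to_save=["embed_tokens", "lm_head"],  # trained in full
    task_type="CAUSAL_LM",
)

# model = get_peft_model(base_model, lora_config)  # base_model: a loaded LLaMA model
```

Because `modules_to_save` stores full copies of those modules, the resulting checkpoint is larger than a plain LoRA adapter.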

Dataset

We use the cleaned Alpaca dataset from alpaca-lora and the full Guanaco dataset to train the pretrained model.

Guanaco dataset: link

Usage

Basically the same as alpaca-lora.
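For reference, a hypothetical finetuning invocation, assuming this fork keeps alpaca-lora's `finetune.py` interface; the model name and paths are placeholders, not values from this repository:

```shell
python finetune.py \
    --base_model 'decapoda-research/llama-7b-hf' \
    --data_path './data' \
    --output_dir './guanaco-lora'
```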

Example

Some examples of instruction-based QA and instruction-based chat (see image).

Resource

Todo...