To install the package, run the following command:

```bash
pip install -r requirements.txt
```
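Assuming the requirements pull in PyTorch (not confirmed here; typical for these recipes), a quick sanity check that the GPU stack is visible:

```python
# check_torch.py -- quick sanity check (assumes requirements.txt installs PyTorch)
import torch

print(torch.__version__)          # installed PyTorch version
print(torch.cuda.is_available())  # True if a CUDA GPU is visible
```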
If you want to use the library across multiple nodes, install the following packages as well:

```bash
module load openmpi/4.x.x
pip install mpi4py
```
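A minimal sketch to verify that mpi4py was built against the loaded Open MPI (the process count and script name are arbitrary):

```python
# check_mpi.py -- verify the mpi4py installation
# run with: mpirun -np 2 python check_mpi.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
print(f"rank {comm.Get_rank()} of {comm.Get_size()}")
```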
To install FlashAttention (a GPU is required), run the following commands:

```bash
pip install ninja packaging wheel
pip install flash-attn --no-build-isolation
```
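A minimal smoke test for the FlashAttention install, using the public `flash_attn_func` API (tensor shapes here are arbitrary; a CUDA GPU and fp16/bf16 inputs are required):

```python
# check_flash_attn.py -- FlashAttention smoke test (needs a CUDA GPU)
import torch
from flash_attn import flash_attn_func

# q, k, v have shape (batch, seqlen, num_heads, head_dim)
q = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
k = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")
v = torch.randn(1, 128, 8, 64, dtype=torch.float16, device="cuda")

out = flash_attn_func(q, k, v, causal=True)
print(out.shape)  # torch.Size([1, 128, 8, 64])
```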
If you use ABCI to run the experiments, an install script is available at kotoba-recipes/install.sh.
scripts/abci/instruction contains the scripts to run instruction tuning on ABCI.
If you want to use custom instructions, you need to modify src/llama_recipes/datasets/alpaca_dataset.py accordingly; a sketch of the expected data format follows.
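In the upstream llama-recipes, alpaca_dataset.py reads a JSON list of records with `instruction`, `input`, and `output` keys. Assuming this fork keeps that format, a custom dataset file could be produced as below (the path and the example records are illustrative, not part of the repository):

```python
# make_custom_dataset.py -- write a custom instruction dataset in the
# alpaca-style format (instruction / input / output keys) that
# alpaca_dataset.py expects; path and records are illustrative.
import json

records = [
    {
        "instruction": "Summarize the following text.",
        "input": "FlashAttention reduces the memory usage of attention.",
        "output": "FlashAttention is a memory-efficient attention algorithm.",
    },
    {
        # "input" may be empty for instructions that need no extra context
        "instruction": "Explain what continual pre-training is.",
        "input": "",
        "output": "Continual pre-training resumes pre-training of an "
                  "existing model on new data.",
    },
]

with open("data/custom_alpaca_data.json", "w") as f:
    json.dump(records, f, ensure_ascii=False, indent=2)
```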
scripts/abci/ contains the scripts to run LLM continual pre-training on ABCI. The 7B, 13B, and 70B directories contain the scripts for experiments with the corresponding Llama 2 model size.