LASSL

Easy Language Model Pretraining leveraging Hugging Face's Transformers and Datasets


What is LASSL

LASSL is a LAnguage framework for Self-Supervised Learning. LASSL aims to provide an easy-to-use framework for pretraining language models using only Hugging Face's Transformers and Datasets.

Environment setting

You can install the required packages with pip:

pip3 install -r requirements.txt

or you can set up the environment with Poetry:

# Install poetry 
curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python -
# Environment setting with poetry
poetry install

How to Use

  • Language model pretraining is divided into three steps: 1. Train Tokenizer, 2. Serialize Corpora, 3. Pretrain Language Model.
  • After preparing a corpus in one of the supported corpus types, you can pretrain your own language model.

1. Train Tokenizer

python3 train_tokenizer.py \
    --corpora_dir $CORPORA_DIR \
    --corpus_type $CORPUS_TYPE \
    --sampling_ratio $SAMPLING_RATIO \
    --model_type $MODEL_TYPE \
    --vocab_size $VOCAB_SIZE \
    --min_frequency $MIN_FREQUENCY
# with poetry
poetry run python3 train_tokenizer.py \
    --corpora_dir $CORPORA_DIR \
    --corpus_type $CORPUS_TYPE \
    --sampling_ratio $SAMPLING_RATIO \
    --model_type $MODEL_TYPE \
    --vocab_size $VOCAB_SIZE \
    --min_frequency $MIN_FREQUENCY
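As a rough intuition for what `--vocab_size` and `--min_frequency` control, here is a toy frequency-based vocabulary builder in plain Python. This is only an illustrative sketch, not LASSL's actual trainer, which builds subword tokenizers (e.g. BPE or WordPiece) via Hugging Face's Tokenizers:

```python
from collections import Counter

def build_vocab(corpus, vocab_size, min_frequency):
    """Toy illustration of frequency-based vocabulary building.

    Counts whitespace tokens, drops those seen fewer than
    min_frequency times, and keeps the vocab_size most frequent.
    Real subword trainers merge character pairs instead of
    splitting on whitespace.
    """
    counts = Counter(token for line in corpus for token in line.split())
    frequent = {tok: n for tok, n in counts.items() if n >= min_frequency}
    # Sort by descending frequency, then alphabetically for determinism.
    most_common = sorted(frequent, key=lambda t: (-frequent[t], t))
    return most_common[:vocab_size]

corpus = [
    "the quick brown fox",
    "the lazy dog",
    "the quick dog",
]
vocab = build_vocab(corpus, vocab_size=3, min_frequency=2)
print(vocab)  # ['the', 'dog', 'quick']
```

Rare tokens are pruned first (`--min_frequency`), then the remainder is truncated to the budget (`--vocab_size`); the real trainer applies the same two constraints while learning subword merges.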

2. Serialize Corpora

python3 serialize_corpora.py \
    --model_type $MODEL_TYPE \
    --tokenizer_dir $TOKENIZER_DIR \
    --corpora_dir $CORPORA_DIR \
    --corpus_type $CORPUS_TYPE \
    --max_length $MAX_LENGTH \
    --num_proc $NUM_PROC
# with poetry
poetry run python3 serialize_corpora.py \
    --model_type $MODEL_TYPE \
    --tokenizer_dir $TOKENIZER_DIR \
    --corpora_dir $CORPORA_DIR \
    --corpus_type $CORPUS_TYPE \
    --max_length $MAX_LENGTH \
    --num_proc $NUM_PROC
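Serialization here means turning raw documents into fixed-length training examples. The following is a minimal sketch of that idea in plain Python (not the actual `serialize_corpora.py` implementation, which does this at scale with Datasets): tokenize every document, concatenate the token ids into one stream, and cut the stream into blocks of `--max_length` tokens:

```python
def serialize(documents, tokenize, max_length):
    """Toy sketch of corpus serialization: tokenize each document,
    concatenate all token ids, and split the stream into
    fixed-length training examples."""
    stream = []
    for doc in documents:
        stream.extend(tokenize(doc))
    # Drop the trailing remainder so every example has exactly max_length tokens.
    n_blocks = len(stream) // max_length
    return [stream[i * max_length:(i + 1) * max_length] for i in range(n_blocks)]

# Hypothetical word-level tokenizer, for illustration only.
vocab = {"the": 0, "quick": 1, "brown": 2, "fox": 3, "dog": 4}
tokenize = lambda text: [vocab[w] for w in text.split()]

blocks = serialize(["the quick brown fox", "the quick dog"], tokenize, max_length=3)
print(blocks)  # [[0, 1, 2], [3, 0, 1]]
```

Note how the second block spans a document boundary; packing examples this way avoids wasting compute on padding during pretraining.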

3. Pretrain Language Model

python3 pretrain_language_model.py --config_path $CONFIG_PATH
# with poetry
poetry run python3 pretrain_language_model.py --config_path $CONFIG_PATH
# When using TPU, use the command below. (Poetry environment does not provide PyTorch XLA as default.)
python3 xla_spawn.py --num_cores $NUM_CORES pretrain_language_model.py --config_path $CONFIG_PATH
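The pretraining objective depends on the model type; for BERT-style models it is masked language modeling. Below is a toy, stdlib-only sketch of the core idea, assuming a single mask token id and a flat masking probability (real training uses Hugging Face's data collators, which additionally apply the 80/10/10 mask/random/keep split):

```python
import random

def mask_tokens(input_ids, mask_id, mlm_prob=0.15, seed=0):
    """Toy sketch of the masked-language-modeling objective:
    hide a fraction of tokens and keep the originals as labels,
    so the model learns to reconstruct them from context."""
    rng = random.Random(seed)
    masked = list(input_ids)
    labels = [-100] * len(input_ids)  # -100 = position ignored by the loss
    for i, tok in enumerate(input_ids):
        if rng.random() < mlm_prob:
            labels[i] = tok   # remember the original token as the target
            masked[i] = mask_id
    return masked, labels

# Hypothetical token ids; mask_id=0 stands in for a [MASK] token.
masked, labels = mask_tokens([5, 6, 7, 8, 9, 10], mask_id=0, mlm_prob=0.5, seed=0)
```

Every position is either ignored by the loss (`-100`) or replaced by the mask token with its original id kept as the target.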

Contributors

Boseop Kim Minho Ryu Inje Ryu Jangwon Park Hyoungseok Kim

Acknowledgements

LASSL is built with Cloud TPU support from the TensorFlow Research Cloud (TFRC) program.