Code based on Hugging Face Transformers. This code uses the KoBERT model rather than the DistilBERT model.
** The code will be modified so that either model can be selected from the command line
Included functions
- get_distilkobert_model()
- get_kobert_model()
Imported modules
- transformers (BERT, DistilBERT), tokenization.py
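A minimal sketch of the planned command-line model selection, mapping a --model flag to the loader functions listed above. The stub bodies below are hypothetical stand-ins; in the actual code they would be the repo's get_kobert_model() and get_distilkobert_model().

```python
import argparse

# Hypothetical stand-ins for the real loaders in this repo;
# the actual functions would return the loaded model objects.
def get_kobert_model():
    return "kobert"

def get_distilkobert_model():
    return "distilkobert"

# Map the command-line choice to the corresponding loader.
MODEL_LOADERS = {
    "kobert": get_kobert_model,
    "distilkobert": get_distilkobert_model,
}

def load_model_from_args(argv):
    """Parse --model from argv and call the matching loader."""
    parser = argparse.ArgumentParser()
    parser.add_argument("--model", choices=list(MODEL_LOADERS),
                        default="kobert")
    args = parser.parse_args(argv)
    return MODEL_LOADERS[args.model]()
```

For example, `load_model_from_args(["--model", "distilkobert"])` would dispatch to the DistilKoBERT loader, while an empty argument list falls back to the KoBERT default.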
- Huggingface Transformers documentation
- DistilKoBERT
- KorQuAD-beginner
- Chan et al., 2019. A Recurrent BERT-based Model for Question Generation
- Wang et al., 2019. BERT has a Mouth, and It Must Speak: BERT as a Markov Random Field Language Model