What subword tokenizer do you use?
conan1024hao opened this issue · 2 comments
conan1024hao commented
Hi, I am just curious: which subword tokenizer did you use when training the model, BPE or WordPiece?
bage79 commented
We trained this model using the SentencePiece Unigram tokenizer.
conan1024hao commented
@bage79 Thank you!