Upload tokenizer?
jtarthur5 opened this issue · 3 comments
jtarthur5 commented
Your test case in the README gives the following error:
File "/.../transformers/tokenization_utils_fast.py", line 107, in __init__
fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: No such file or directory (os error 2)
To get this working, we had to use bert-large-uncased
for the tokenizer. It looks like there is no tokenizer.json
in the hub to pull down.
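For reference, a minimal sketch of the workaround described above, assuming the standard `transformers` API (the exact model class used by the project is not shown here):

```python
# Workaround sketch: since the project's HF repo had no tokenizer.json
# at the time, load the tokenizer from the base bert-large-uncased
# checkpoint instead of the project's repo.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-large-uncased")

# The tokenizer can then be used as usual, e.g.:
encoded = tokenizer("Hello, world!", return_tensors="pt")
```

This only substitutes the tokenizer; the model weights are still loaded from the project's own repository.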
fractalego commented
Thank you, I'll have a look. My guess is the original repository changed.
fractalego commented
Hi, I have uploaded the tokenizer files to the HF repo. I am still investigating the issue, but it should work now.
Please let me know if there is any other issue.
jtarthur5 commented
Thanks!