ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.
dermoritz opened this issue · 0 comments
dermoritz commented
If I try to add models from Hugging Face via the UI settings, like "TheBloke/Wizard-Vicuna-7B-Uncensored-HF" or "ehartford/Wizard-Vicuna-7B-Uncensored" (I also tried others), I get this error:
```
Arguments: ('task(u21crq97y5fd7ci)', 'TheBloke/Wizard-Vicuna-7B-Uncensored-HF', 1, 10, 'fish & chips', 20, 150, 1, 1, 1, 1, 'Top K', 12, 0.15) {}
Traceback (most recent call last):
  File "D:\stableDiffusion\stable-diffusion-webui\modules\call_queue.py", line 57, in f
    res = list(func(*args, **kwargs))
  File "D:\stableDiffusion\stable-diffusion-webui\modules\call_queue.py", line 37, in f
    res = func(*args, **kwargs)
  File "D:\stableDiffusion\stable-diffusion-webui\extensions\stable-diffusion-webui-promptgen\scripts\promptgen.py", line 98, in generate
    current.tokenizer = transformers.AutoTokenizer.from_pretrained(path)
  File "D:\stableDiffusion\stable-diffusion-webui\venv\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 638, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LlamaTokenizer does not exist or is not currently imported.
```
I also tried `pip install git+https://github.com/huggingface/transformers` from here, but it didn't help.
Any suggestions or tips?
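For anyone hitting the same error: this usually means the `transformers` build inside the webui's venv predates `LlamaTokenizer` (it was added in transformers 4.28), or the `sentencepiece` package the slow Llama tokenizer depends on is missing. A minimal preflight sketch to check both, run with the venv's Python (the function name `llama_tokenizer_preflight` is just an illustrative helper, not part of the extension):

```python
import importlib.util
from importlib import metadata


def llama_tokenizer_preflight():
    """Report on the dependencies LlamaTokenizer needs.

    LlamaTokenizer only exists in transformers >= 4.28, and its slow
    (non-Rust) variant additionally requires the sentencepiece package.
    """
    try:
        tf_version = metadata.version("transformers")
    except metadata.PackageNotFoundError:
        tf_version = None  # transformers is not installed in this environment
    return {
        "transformers": tf_version,
        "sentencepiece": importlib.util.find_spec("sentencepiece") is not None,
    }


if __name__ == "__main__":
    print(llama_tokenizer_preflight())
```

If the report shows an old version or missing `sentencepiece`, upgrading inside the venv (`venv\Scripts\pip install -U transformers sentencepiece` on Windows) is the usual fix; note that `pip install` run outside the venv does not affect the webui.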