tensorchord/modelz-llm

Tokenizer class LLaMATokenizer does not exist or is not currently imported.


Could you please help me? I am running into the following issue. Thank you so much!

(base) ubuntu@VM-48-6-ubuntu:~$ modelz-llm -m decapoda-research/llama-7b-hf
Namespace(model='decapoda-research/llama-7b-hf', emb_model='sentence-transformers/all-MiniLM-L6-v2', dry_run=False, device='auto', port=8000, worker=1)
2023-10-18 23:55:43,181 - 5543 - INFO - model.py:32 - loading model and embedding: <class 'transformers.models.auto.modeling_auto.AutoModelForCausalLM'>(decapoda-research/llama-7b-hf) <class 'transformers.models.auto.tokenization_auto.AutoTokenizer'>(decapoda-research/llama-7b-hf)
Process SpawnProcess-1:
Traceback (most recent call last):
File "/home/ubuntu/anaconda3/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
self.run()
File "/home/ubuntu/anaconda3/lib/python3.9/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/ubuntu/anaconda3/lib/python3.9/site-packages/modelz_llm/uds.py", line 27, in init
self.func = cls(**kwargs)
File "/home/ubuntu/anaconda3/lib/python3.9/site-packages/modelz_llm/model.py", line 39, in init
self.tokenizer = tokenizer_cls.from_pretrained(
File "/home/ubuntu/anaconda3/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 748, in from_pretrained
raise ValueError(
ValueError: Tokenizer class LLaMATokenizer does not exist or is not currently imported.
Traceback (most recent call last):
File "/home/ubuntu/anaconda3/bin/modelz-llm", line 8, in
sys.exit(main())
File "/home/ubuntu/anaconda3/lib/python3.9/site-packages/modelz_llm/cli.py", line 62, in main
app = build_falcon_app(args)
File "/home/ubuntu/anaconda3/lib/python3.9/site-packages/modelz_llm/falcon_service.py", line 128, in build_falcon_app
raise RuntimeError("failed to start the service")
RuntimeError: failed to start the service
(base) ubuntu@VM-48-6-ubuntu:~$
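
For reference, the ValueError can be reproduced with transformers alone, outside of modelz-llm. A likely cause (an assumption, not confirmed in this issue) is that the tokenizer_config.json in decapoda-research/llama-7b-hf still names the tokenizer class LLaMATokenizer, while recent transformers releases only register LlamaTokenizer, so AutoTokenizer rejects the class-name lookup. A minimal sketch under that assumption, including a possible workaround that loads the tokenizer class directly:

# Sketch: reproduce the AutoTokenizer failure, then bypass the class-name
# lookup by loading LlamaTokenizer directly (assumes the mismatch comes from
# the repo's tokenizer_config.json, not from modelz-llm itself).
from transformers import AutoTokenizer, LlamaTokenizer

repo = "decapoda-research/llama-7b-hf"

try:
    # Mirrors what modelz_llm/model.py does via tokenizer_cls.from_pretrained.
    AutoTokenizer.from_pretrained(repo)
except ValueError as err:
    print(f"AutoTokenizer failed: {err}")

# Possible workaround: skip the tokenizer_config.json class lookup entirely.
tokenizer = LlamaTokenizer.from_pretrained(repo)
print(type(tokenizer).__name__)  # expected: LlamaTokenizer

If the direct load works, editing the cached tokenizer_config.json to use "tokenizer_class": "LlamaTokenizer" (or pointing modelz-llm at a LLaMA repo with an up-to-date config) should let AutoTokenizer succeed as well.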