DAMO-NLP-SG/Video-LLaMA

Hugging Face demo runtime error

sihoseanhan opened this issue · 2 comments

I get the following error when opening your demo link.

Runtime error
Initializing Chat
Traceback (most recent call last):
  File "/home/user/app/app.py", line 67, in <module>
    model = model_cls.from_config(model_config).to('cuda:{}'.format(args.gpu_id))
  File "/home/user/app/video_llama/models/video_llama.py", line 395, in from_config
    model = cls(
  File "/home/user/app/video_llama/models/video_llama.py", line 73, in __init__
    self.tokenizer = self.init_tokenizer()
  File "/home/user/app/video_llama/models/blip2.py", line 32, in init_tokenizer
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
  File "/home/user/.pyenv/versions/3.10.13/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 1795, in from_pretrained
    raise EnvironmentError(
OSError: Can't load tokenizer for 'bert-base-uncased'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'bert-base-uncased' is the correct path to a directory containing all relevant files for a BertTokenizer tokenizer.
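The failure happens inside `blip2.init_tokenizer()`, before any Video-LLaMA weights are touched, so it can usually be isolated by running the same call outside the app. A minimal check, assuming `transformers` is installed and the machine can reach huggingface.co:

```python
from transformers import BertTokenizer

# Same call the app makes in video_llama/models/blip2.py.
# If this raises the same OSError, the problem is connectivity or the
# local HF cache, not Video-LLaMA itself.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
print(tokenizer("hello world"))
```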

Thanks for your kind reminder. This was probably caused by a network issue. The demo is back to normal now.
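One way to make the Space less sensitive to transient network problems is to fetch the tokenizer files once (for example at build time) and then load them strictly from disk. A sketch, assuming `huggingface_hub` is available; `local_dir` is just an illustrative variable name:

```python
from huggingface_hub import snapshot_download
from transformers import BertTokenizer

# Download (or reuse) the tokenizer files in the local HF cache.
# allow_patterns keeps the download small: vocab and config files only.
local_dir = snapshot_download(
    repo_id="bert-base-uncased",
    allow_patterns=["*.txt", "*.json"],
)

# Load only from the local files, so a flaky connection at runtime
# cannot break app startup.
tokenizer = BertTokenizer.from_pretrained(local_dir, local_files_only=True)
```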

I still get this error. Could you help check the network again?
