OpenMOSS/AnyGPT

huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/mnt/petrelfs/zhanjun.p/mllm/models/bert-base-uncased'. Use `repo_type` argument if needed.

Closed · 1 comment

I get the following error when running inference:

```
2024-03-26 12:57:36.254249: E external/local_xla/xla/stream_executor/cuda/cuda_dnn.cc:9261] Unable to register cuDNN factory: Attempting to register factory for plugin cuDNN when one has already been registered
2024-03-26 12:57:36.254297: E external/local_xla/xla/stream_executor/cuda/cuda_fft.cc:607] Unable to register cuFFT factory: Attempting to register factory for plugin cuFFT when one has already been registered
2024-03-26 12:57:36.255652: E external/local_xla/xla/stream_executor/cuda/cuda_blas.cc:1515] Unable to register cuBLAS factory: Attempting to register factory for plugin cuBLAS when one has already been registered
2024-03-26 12:57:37.344594: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
Since the GPL-licensed package `unidecode` is not installed, using Python's `unicodedata` package which yields worse results.
 NeMo-text-processing :: INFO     :: Creating ClassifyFst grammars.
loading image tokenzier
Traceback (most recent call last):
  File "/content/AnyGPT/anygpt/src/infer/cli_infer_base_model.py", line 337, in <module>
    infer = AnyGPTInference(
  File "/content/AnyGPT/anygpt/src/infer/cli_infer_base_model.py", line 46, in __init__
    self.image_tokenizer = ImageTokenizer(model_path=image_tokenizer_path, load_diffusion=True,
  File "/content/AnyGPT/./seed2/seed_llama_tokenizer.py", line 39, in __init__
    model = Blip2QformerQuantizer.from_pretrained(pretrained_model_path=model_path,
  File "/content/AnyGPT/./seed2/seed_qformer/qformer_quantizer.py", line 354, in from_pretrained
    model = cls(
  File "/content/AnyGPT/./seed2/seed_qformer/qformer_quantizer.py", line 182, in __init__
    self.tokenizer = self.init_tokenizer()
  File "/content/AnyGPT/./seed2/seed_qformer/blip2.py", line 38, in init_tokenizer
    tokenizer = BertTokenizer.from_pretrained("/mnt/petrelfs/zhanjun.p/mllm/models/bert-base-uncased", truncation_side=truncation_side)
  File "/usr/local/lib/python3.10/dist-packages/transformers/tokenization_utils_base.py", line 1940, in from_pretrained
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py", line 429, in cached_file
    resolved_file = hf_hub_download(
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 111, in _inner_fn
    validate_repo_id(arg_value)
  File "/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_validators.py", line 159, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/mnt/petrelfs/zhanjun.p/mllm/models/bert-base-uncased'. Use `repo_type` argument if needed.
```
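For context on the error: the traceback shows `seed2/seed_qformer/blip2.py` passing a hardcoded cluster path (`/mnt/petrelfs/zhanjun.p/mllm/models/bert-base-uncased`) to `BertTokenizer.from_pretrained`. That call accepts either a directory that exists on disk or a Hub repo id of the form `name` or `namespace/name`; on a machine where the path does not exist (e.g. Colab), `huggingface_hub`'s repo-id validator rejects it. A minimal sketch of one possible workaround — not necessarily the maintainers' actual fix — is to fall back to the public `bert-base-uncased` Hub id when the local directory is absent (the helper name `resolve_tokenizer_source` is hypothetical):

```python
import os


def resolve_tokenizer_source(local_path: str, hub_fallback: str) -> str:
    """Return a value safe to pass to BertTokenizer.from_pretrained.

    from_pretrained accepts either an existing directory or a Hub repo id
    ('name' or 'namespace/name'); an absolute path that does not exist on
    this machine fails huggingface_hub's repo-id validation, producing the
    HFValidationError seen in the traceback above.
    """
    return local_path if os.path.isdir(local_path) else hub_fallback


# The hardcoded path from the traceback does not exist outside the
# authors' cluster, so this falls back to the public Hub id.
source = resolve_tokenizer_source(
    "/mnt/petrelfs/zhanjun.p/mllm/models/bert-base-uncased",
    "bert-base-uncased",
)
print(source)
```

The tokenizer could then be created with `BertTokenizer.from_pretrained(source, truncation_side=truncation_side)`, keeping the local copy when it is available and avoiding the validation error elsewhere.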

Thank you for the report. The issue has been fixed; please try the latest code. If you run into any further problems, feel free to contact us.