The gradio demo deployment in chapter1 is missing the my-bert-model folder
hrj-11055 opened this issue · 1 comment
Hello, I was following the gradio online deployment demo in chapter1's README.md, and it is unclear where my-bert-model comes from. I have already trained the classification model and saved it under the experiments folder, but I don't know what to do next. Am I supposed to rename experiments to my-bert-model?
Finally, when I ran python app.py, I got the following error:
Traceback (most recent call last):
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/my-bert-model/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/transformers/utils/hub.py", line 402, in cached_file
    resolved_file = hf_hub_download(
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/utils/_deprecation.py", line 101, in inner_f
    return f(*args, **kwargs)
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1240, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1347, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1854, in _raise_on_head_call_error
    raise head_call_error
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1751, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 1673, in get_hf_file_metadata
    r = _request_wrapper(
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 376, in _request_wrapper
    response = _request_wrapper(
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/file_download.py", line 400, in _request_wrapper
    hf_raise_for_status(response)
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/huggingface_hub/utils/_errors.py", line 352, in hf_raise_for_status
    raise RepositoryNotFoundError(message, response) from e
huggingface_hub.utils._errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-66d870c1-067c38990fd3e6bf6e964c6f;4bb95157-14d4-4d4e-b93f-b7b7a0fe1312)
Repository Not Found for url: https://huggingface.co/my-bert-model/resolve/main/config.json.
Please make sure you specified the correct repo_id and repo_type.
If you are trying to access a private or gated repo, make sure you are authenticated.
Invalid username or password.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/mark/dive-into-llms/documents/chapter1/TextClassification/app.py", line 13, in <module>
    config = AutoConfig.from_pretrained(model_dir, num_labels=3, finetuning_task="text-classification")
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 965, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/transformers/configuration_utils.py", line 632, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/transformers/configuration_utils.py", line 689, in _get_config_dict
    resolved_config_file = cached_file(
  File "/home/mark/miniconda3/envs/llm3.9/lib/python3.9/site-packages/transformers/utils/hub.py", line 425, in cached_file
    raise EnvironmentError(
OSError: my-bert-model is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'
If this is a private repository, make sure to pass a token having permission to this repo either by logging in with huggingface-cli login or by passing token=<your_token>
Line 10 of app.py sets model_dir='my-bert-model', which is meant to point to the local model directory. But as written, from_pretrained does not find such a folder and instead treats the string as a Hub repo id, requesting the invalid address https://huggingface.co/my-bert-model. Changing it to model_dir='./my-bert-model' makes from_pretrained read the model from the local folder './my-bert-model'.
In short, setting model_dir='./experiments' directly should solve both of your problems. Thanks for your interest!
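As an aside, the path-vs-repo-id behavior can be illustrated with a small sketch. This is a simplified stand-in for transformers' actual resolution logic, and resolve_model_source is a hypothetical helper written for this explanation, not a real API:

```python
import os

def resolve_model_source(name_or_path: str) -> str:
    """Mimic, in simplified form, how from_pretrained decides between
    a local directory and a Hugging Face Hub repo id."""
    if os.path.isdir(name_or_path):
        # An existing directory (relative or absolute) is loaded locally.
        return f"local folder: {name_or_path}"
    # Anything else is treated as a repo id and fetched from the Hub,
    # which is why 'my-bert-model' led to a 401/RepositoryNotFoundError.
    return f"Hub repo: https://huggingface.co/{name_or_path}"

print(resolve_model_source("./experiments"))  # local, if the folder exists here
print(resolve_model_source("my-bert-model"))  # no such folder -> Hub lookup
```

So any string that names an existing directory, with or without the leading './', is loaded locally; the 401 in the traceback simply means no folder named my-bert-model existed in the working directory.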