huggingface/huggingface_hub

Downloading large files gets stuck at 0% forever when using a VPN, but wget works

truebit opened this issue · 2 comments

Describe the bug

I use a VPN (specifically ShadowRocket) to access huggingface.co and hf.co, and I tried to download files with huggingface-cli download. It works with small files, but large files such as *.safetensors and *.bin get stuck at 0% forever.
I enabled debug logging in /Users/user/miniforge3/envs/py312/bin/huggingface-cli by adding

import logging
logging.basicConfig(level=logging.DEBUG)

but the log did not show any useful information.

By the way, I can download the same large files directly with wget in the same terminal.

Reproduction

No response

Logs

DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): huggingface.co:443
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "GET /api/models/Qwen/Qwen2-VL-2B-Instruct/revision/main HTTP/11" 200 5485
DEBUG:urllib3.connectionpool:https://huggingface.co:443 "HEAD /Qwen/Qwen2-VL-2B-Instruct/resolve/aca78372505e6cb469c4fa6a35c60265b00ff5a4/model-00001-of-00002.safetensors HTTP/11" 302 0
DEBUG:filelock:Attempting to acquire lock 4381438128 on /Users/user/.cache/huggingface/hub/.locks/models--Qwen--Qwen2-VL-2B-Instruct/994ac2b03f97de8bc647d0fe5eba2e4b632b3e28dc03574c29bdfc36cf47e1b9.lock
DEBUG:filelock:Lock 4381438128 acquired on /Users/user/.cache/huggingface/hub/.locks/models--Qwen--Qwen2-VL-2B-Instruct/994ac2b03f97de8bc647d0fe5eba2e4b632b3e28dc03574c29bdfc36cf47e1b9.lock
Removing incomplete file '/Users/user/.cache/huggingface/hub/models--Qwen--Qwen2-VL-2B-Instruct/blobs/994ac2b03f97de8bc647d0fe5eba2e4b632b3e28dc03574c29bdfc36cf47e1b9.incomplete' (hf_transfer=True)
INFO:huggingface_hub.file_download:Removing incomplete file '/Users/user/.cache/huggingface/hub/models--Qwen--Qwen2-VL-2B-Instruct/blobs/994ac2b03f97de8bc647d0fe5eba2e4b632b3e28dc03574c29bdfc36cf47e1b9.incomplete' (hf_transfer=True)
Downloading 'model-00001-of-00002.safetensors' to '/Users/user/.cache/huggingface/hub/models--Qwen--Qwen2-VL-2B-Instruct/blobs/994ac2b03f97de8bc647d0fe5eba2e4b632b3e28dc03574c29bdfc36cf47e1b9.incomplete'
INFO:huggingface_hub.file_download:Downloading 'model-00001-of-00002.safetensors' to '/Users/user/.cache/huggingface/hub/models--Qwen--Qwen2-VL-2B-Instruct/blobs/994ac2b03f97de8bc647d0fe5eba2e4b632b3e28dc03574c29bdfc36cf47e1b9.incomplete'
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): cdn-lfs-us-1.hf.co:443
DEBUG:urllib3.connectionpool:https://cdn-lfs-us-1.hf.co:443 "GET /repos/3f/ce/3fceb28378b4f9468158565789f82717a790f8a43717561e45fb70cfe92e2fa0/994ac2b03f97de8bc647d0fe5eba2e4b632b3e28dc03574c29bdfc36cf47e1b9?response-content-disposition=inline%3B+filename*%3DUTF-8%27%27model-00001-of-00002.safetensors%3B+filename%3D%22model-00001-of-00002.safetensors%22%3B&Expires=1733563615&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTczMzU2MzYxNX19LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2RuLWxmcy11cy0xLmhmLmNvL3JlcG9zLzNmL2NlLzNmY2ViMjgzNzhiNGY5NDY4MTU4NTY1Nzg5ZjgyNzE3YTc5MGY4YTQzNzE3NTYxZTQ1ZmI3MGNmZTkyZTJmYTAvOTk0YWMyYjAzZjk3ZGU4YmM2NDdkMGZlNWViYTJlNGI2MzJiM2UyOGRjMDM1NzRjMjliZGZjMzZjZjQ3ZTFiOT9yZXNwb25zZS1jb250ZW50LWRpc3Bvc2l0aW9uPSoifV19&Signature=m9xVp511wySkX57fHAhdO9HFUhF-VCWB6WwA10jcpFqruRPA3Wp2RxROavhM7o4v7-u38~aNOqM6dXpWOy~5IxrDBT8lLVXiVe4KAK6Rf7v2zBc5iOY-HLMuJnfCHylYYDy5T1FxgbpkjhivpUF702G0P3lo9Gf~FnjOdxu9oJ9fHGYPM1KhIXZsEsEdp0vbU3AH0rOzQ37TgYesTQsQT4z~YPpjKMtozvfMZRAP1NEIyBSAz4a69LvXxJehPEId4OyBd7jeR~9x2Ux~QnhJ1gPwL79HcYnHdsT9tqh4dOaRpq9yF3YKnscvXThfV9ip7r4n~avNGqFTm~SBi1wSKA__&Key-Pair-Id=K24J24Z295AEI9 HTTP/11" 200 3988609112
model-00001-of-00002.safetensors:   0%|                                                                                                               | 0.00/3.99G [00:00<?, ?B/s]

System info

- huggingface_hub version: 0.26.3
- Platform: macOS-15.1.1-arm64-arm-64bit
- Python version: 3.12.4
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Running in Google Colab Enterprise ?: No
- Token path ?: /Users/user/.cache/huggingface/token
- Has saved token ?: True
- Who am I ?: truebit
- Configured git credential helpers: osxkeychain
- FastAI: N/A
- Tensorflow: N/A
- Torch: N/A
- Jinja2: 3.1.4
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 11.0.0
- hf_transfer: 0.1.8
- gradio: N/A
- tensorboard: N/A
- numpy: 2.1.1
- pydantic: 2.9.2
- aiohttp: 3.9.5
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: /Users/user/.cache/huggingface/hub
- HF_ASSETS_CACHE: /Users/user/.cache/huggingface/assets
- HF_TOKEN_PATH: /Users/user/.cache/huggingface/token
- HF_STORED_TOKENS_PATH: /Users/user/.cache/huggingface/stored_tokens
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: True
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10

Hi @truebit, sorry you encountered this issue. Could you retry with hf_transfer disabled?

export HF_HUB_ENABLE_HF_TRANSFER=0

hf_transfer is a power-user tool optimized for performance in standard setups, as mentioned in the documentation here. It prioritizes speed but doesn't support features like proxies or resumable downloads, so the default downloader might work better with your VPN setup, or at least surface explicit errors if something fails.
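If the download is scripted in Python rather than run through the CLI, the same switch can be applied in-process. A minimal sketch; the key point is that the variable must be set before huggingface_hub is imported, since the flag is read from the environment once at import time (constants.HF_HUB_ENABLE_HF_TRANSFER as the location of the parsed flag is my assumption):

```python
import os

# Must happen before huggingface_hub is imported: the flag is
# parsed from the environment once, at import time.
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "0"

from huggingface_hub import constants, hf_hub_download

# With hf_transfer off, downloads fall back to the default
# requests-based downloader, which honors proxy settings.
print(constants.HF_HUB_ENABLE_HF_TRANSFER)

# Then retry the download that was stalling, e.g.:
# hf_hub_download(repo_id="Qwen/Qwen2-VL-2B-Instruct",
#                 filename="model-00001-of-00002.safetensors")
```
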

Let us know if this resolves your issue!

@hanouticelina Thanks for the advice. I will try with HF_HUB_ENABLE_HF_TRANSFER disabled.