[Pitfall] Fixing Hugging Face download problems

More download methods are covered at https://zhuanlan.zhihu.com/p/663712983

Problem 1: couldn't connect to 'https://huggingface.co'

Running the following code fails with an error:

from transformers import SeamlessM4TFeatureExtractor
processor = SeamlessM4TFeatureExtractor.from_pretrained("facebook/w2v-bert-2.0")

Error:

OSError                                   Traceback (most recent call last)
File ~/miniconda3/envs/maskgct/lib/python3.10/site-packages/urllib3/connection.py:198, in HTTPConnection._new_conn(self)
    197 try:
--> 198     sock = connection.create_connection(
    199         (self._dns_host, self.port),
    200         self.timeout,
    201         source_address=self.source_address,
    202         socket_options=self.socket_options,
    203     )
    204 except socket.gaierror as e:

File ~/miniconda3/envs/maskgct/lib/python3.10/site-packages/urllib3/util/connection.py:85, in create_connection(address, timeout, source_address, socket_options)
     84 try:
---> 85     raise err
     86 finally:
     87     # Break explicitly a reference cycle

File ~/miniconda3/envs/maskgct/lib/python3.10/site-packages/urllib3/util/connection.py:73, in create_connection(address, timeout, source_address, socket_options)
     72     sock.bind(source_address)
---> 73 sock.connect(sa)
     74 # Break explicitly a reference cycle

OSError: [Errno 101] Network is unreachable
...
    448 except EntryNotFoundError as e:
    449     if not _raise_exceptions_for_missing_entries:

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like facebook/w2v-bert-2.0 is not the path to a directory containing a file named preprocessor_config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

The fix is simple: add the following at the very top of the script, before importing transformers or huggingface_hub, so that all Hub requests go through a mirror:

import os
os.environ['HF_ENDPOINT'] = 'https://hf-mirror.com'  # route Hub traffic through hf-mirror.com
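
The same variable can also be exported in the shell, which makes huggingface-cli use the mirror too; for example:

export HF_ENDPOINT=https://hf-mirror.com
huggingface-cli download facebook/w2v-bert-2.0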

Problem 2: HTTPSConnectionPool(host='cdn-lfs-us-1.hf-mirror.com', port=443)

If, after applying the fix from Problem 1, you see

Error while downloading from https://cdn-lfs-us-1.hf-mirror.com/repos/d3/b8/d3b6: HTTPSConnectionPool(host='cdn-lfs-us-1.hf-mirror.com', port=443): Read timed out.
Trying to resume download...

or hf_hub_download raises an exception while running

from huggingface_hub import hf_hub_download

# download semantic codec ckpt
semantic_code_ckpt = hf_hub_download("amphion/MaskGCT", filename="semantic_codec/model.safetensors")
# download acoustic codec ckpt
codec_encoder_ckpt = hf_hub_download("amphion/MaskGCT", filename="acoustic_codec/model.safetensors")
codec_decoder_ckpt = hf_hub_download("amphion/MaskGCT", filename="acoustic_codec/model_1.safetensors")
# download t2s model ckpt
t2s_model_ckpt = hf_hub_download("amphion/MaskGCT", filename="t2s_model/model.safetensors")
# download s2a model ckpt
s2a_1layer_ckpt = hf_hub_download("amphion/MaskGCT", filename="s2a_model/s2a_model_1layer/model.safetensors")
s2a_full_ckpt = hf_hub_download("amphion/MaskGCT", filename="s2a_model/s2a_model_full/model.safetensors")

and the download keeps getting interrupted, you can switch to downloading with huggingface-cli instead.

Basic usage: download a model

huggingface-cli download amphion/MaskGCT

huggingface-cli download bigscience/bloom-560m --local-dir bloom-560m
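
huggingface-cli download also accepts individual file paths after the repo id, which is handy when you only need a few checkpoints; a minimal example using the MaskGCT files mentioned above:

huggingface-cli download amphion/MaskGCT semantic_codec/model.safetensors acoustic_codec/model.safetensors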

Basic usage: download a dataset

huggingface-cli download --repo-type dataset lavita/medical-qa-shared-task-v1-toy

Before v0.23.0: the optional flag --local-dir-use-symlinks False existed because the Hugging Face toolchain stores downloads behind symbolic links by default, so the directory given by --local-dir contains only link files while the real model lives under ~/.cache/huggingface. If you dislike this behavior, pass --local-dir-use-symlinks False to disable it.

From v0.23.0 onward, symlinks are turned off automatically whenever --local-dir is passed, so it is recommended to omit --local-dir so that the model can still be loaded by its id, for example:

from transformers import SeamlessM4TFeatureExtractor
processor = SeamlessM4TFeatureExtractor.from_pretrained("facebook/w2v-bert-2.0")

The from_pretrained function accepts either a model id or a local path where the model is stored.

Suppose you downloaded a model with a browser and stored it on the server under /data/gpt2; to load it, you have to give the absolute path:

from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("/data/gpt2")

However, if you downloaded it with huggingface-cli download gpt2, you can still refer to the model simply by its name, even if you stored it in a directory you chose yourself. That is:

AutoModelForCausalLM.from_pretrained("gpt2")

The reason is that the Hugging Face toolchain maintains symlinks for each model under ~/.cache/huggingface/: whether or not you specify a storage path, the cache directory links back to the files. This keeps you from forgetting which models you have already downloaded and makes loading them convenient.

So by using the official tool, you can both refer to a model conveniently by name and keep all models in a single custom path that is easy to manage.
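
To see which models are already in the local cache and how much space they use, huggingface-cli also provides a scan-cache command; a quick check looks like:

# list cached repos, their revisions and sizes
huggingface-cli scan-cache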

Set up hf_transfer to speed up downloads

hf_transfer plugs into and is compatible with huggingface-cli; it is a module that Hugging Face developed in Rust specifically to improve download speed.

(1) Install the dependency

pip install -U hf-transfer

(2) Set the HF_HUB_ENABLE_HF_TRANSFER environment variable to 1.
Linux

export HF_HUB_ENABLE_HF_TRANSFER=1

Windows PowerShell

$env:HF_HUB_ENABLE_HF_TRANSFER = 1

From then on, huggingface-cli download will use hf_transfer automatically.
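
The same switch works from Python as well; a minimal sketch, assuming hf-transfer is installed (both variables must be set before huggingface_hub is imported):

import os

# must be set before importing huggingface_hub, otherwise they are ignored
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"    # mirror from Problem 1
os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "1"          # use the Rust downloader

from huggingface_hub import hf_hub_download

semantic_code_ckpt = hf_hub_download("amphion/MaskGCT", filename="semantic_codec/model.safetensors")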

Problem 3: requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(88760320 bytes read, 1159257560 more expected)'

Repeated timeout errors during a huggingface-cli download eventually break the connection, and the process aborts with ReadTimeoutError and ChunkedEncodingError.

You can simply retry (not recommended, it wastes time):

# remove the broken partial download, then download again with resume enabled
rm -rf ~/.cache/huggingface/hub/models--amphion--MaskGCT
huggingface-cli download --resume-download amphion/MaskGCT

You can also try hfd, a dedicated multi-threaded downloader; see https://zhuanlan.zhihu.com/p/663712983 for details.
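
Another option is to wrap hf_hub_download in a small retry loop so that transient timeouts do not kill the whole job. The sketch below is only illustrative: download_with_retries is a hypothetical helper, the retry count and wait time are arbitrary, and partially downloaded files are normally resumed from the cache on the next attempt.

import time
from huggingface_hub import hf_hub_download

def download_with_retries(repo_id, filename, max_retries=5, wait_s=10):
    """Hypothetical helper: retry hf_hub_download on transient network errors."""
    for attempt in range(1, max_retries + 1):
        try:
            return hf_hub_download(repo_id, filename=filename)
        except Exception as err:  # e.g. ReadTimeoutError, ChunkedEncodingError
            if attempt == max_retries:
                raise
            print(f"Attempt {attempt} failed ({err}); retrying in {wait_s}s...")
            time.sleep(wait_s)

semantic_code_ckpt = download_with_retries("amphion/MaskGCT", "semantic_codec/model.safetensors")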
