Changing the default model download/storage path for the modelscope and huggingface_hub libraries

Models downloaded with the modelscope and huggingface_hub libraries are stored by default in the .cache directory on the C: drive (on Windows). The methods below change the default download path and apply to Windows/Linux/macOS:


I. Changing the HuggingFace default path

Method 1: change it globally via an environment variable
  • Environment variable: HF_HOME or HUGGINGFACE_HUB_CACHE

  • Steps:

    1. Linux/macOS
      # add the export to .bashrc or .zshrc
      echo 'export HF_HOME="/your/custom/path/huggingface"' >> ~/.bashrc
      source ~/.bashrc  # reload so the setting takes effect

    2. Windows
      • Open Control Panel → System and Security → System → Advanced system settings → Environment Variables.
      • Under User variables, create a new variable:
        Variable name: HF_HOME
        Variable value: D:\your\custom\path\huggingface (replace with your actual path).

    Note: once set, HuggingFace caches for models, datasets, etc. are stored under the hub subdirectory of this path. The variable can also be set from inside Python before the libraries are imported; see the sketch below.
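
If you prefer not to touch OS-level settings, a minimal sketch (the path is a placeholder) is to set the variable from Python itself, provided this happens before transformers/huggingface_hub are imported, since the cache location is resolved at import time:

import os

# Must run before the first import of transformers/huggingface_hub.
os.environ["HF_HOME"] = "/your/custom/path/huggingface"  # placeholder path

from transformers import AutoModel

model = AutoModel.from_pretrained("google-bert/bert-base-uncased")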

Method 2: specify the path temporarily in code

Pass the cache_dir argument when loading a model:

from transformers import AutoModel
model = AutoModel.from_pretrained("google-bert/bert-base-uncased", cache_dir="/your/custom/path")

Pros: flexible, well suited to one-off tasks. Cons: has to be added by hand on every call; a small wrapper, sketched below, can centralize it.
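
To avoid repeating the argument, one option is a small helper; load_model and MODEL_CACHE_DIR below are hypothetical names used only for illustration:

from transformers import AutoModel

MODEL_CACHE_DIR = "/your/custom/path"  # single place to change the cache location

def load_model(repo_id: str):
    # every model loaded through this helper shares the same cache directory
    return AutoModel.from_pretrained(repo_id, cache_dir=MODEL_CACHE_DIR)

model = load_model("google-bert/bert-base-uncased")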


II. Changing the ModelScope default path

Method 1: change it globally via an environment variable
  • Environment variable: MODELSCOPE_CACHE

  • Steps (same idea as for HuggingFace):

    1. Linux/macOS
      echo 'export MODELSCOPE_CACHE="/your/custom/path/modelscope"' >> ~/.bashrc
      source ~/.bashrc

    2. Windows
      • Create a new User variable in the environment-variable dialog:
        Variable name: MODELSCOPE_CACHE
        Variable value: D:\your\custom\path\modelscope

    Note: ModelScope models and datasets will be stored under the hub directory of this path. As with HuggingFace, the variable can also be set from Python before modelscope is imported; see the sketch below.
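
Mirroring the HuggingFace sketch above, a minimal in-code version (placeholder path); to be safe, set the variable before modelscope is imported:

import os

# Set the cache location before `import modelscope` so it is picked up.
os.environ["MODELSCOPE_CACHE"] = "/your/custom/path/modelscope"  # placeholder path

from modelscope import snapshot_download

# subsequent downloads land under the new cache path
snapshot_download(model_id="Shanghai_AI_Laboratory/internlm2-chat-7b")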

Method 2: specify the path in code

Set cache_dir when calling snapshot_download:

from modelscope import snapshot_download
snapshot_download(model_id="Shanghai_AI_Laboratory/internlm2-chat-7b", cache_dir="/your/custom/path")
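
A detail worth knowing: snapshot_download returns the local directory the files land in, so the result can be reused wherever a local model path is expected; a minimal sketch:

from modelscope import snapshot_download

local_dir = snapshot_download(
    model_id="Shanghai_AI_Laboratory/internlm2-chat-7b",
    cache_dir="/your/custom/path",
)
print(local_dir)  # local directory that now holds the model files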

III. Notes

  1. Path permissions: make sure the target directory exists and is writable (see the snippet after this list).
  2. Environment-variable precedence:
    • HuggingFace reads HUGGINGFACE_HUB_CACHE first and falls back to HF_HOME if it is not set.
    • ModelScope relies only on MODELSCOPE_CACHE.
  3. Cross-platform compatibility: use backslashes \ (or doubled \\) in Windows paths and forward slashes / on Linux/macOS.
  4. Cleaning up the old cache: after changing the path, files left in the original .cache directory have to be deleted manually.
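
For item 1, a quick standard-library check (the path is a placeholder) that the target directory exists and is writable:

import os

cache_dir = "/your/custom/path/huggingface"  # placeholder path

os.makedirs(cache_dir, exist_ok=True)   # create the directory if it is missing
if not os.access(cache_dir, os.W_OK):
    raise PermissionError(f"No write permission for {cache_dir}")
print("Cache directory is ready:", cache_dir)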

IV. Verifying that the change took effect

  • HuggingFace
    # prints the resolved hub cache directory (recent huggingface_hub versions)
    from huggingface_hub import constants
    print(constants.HF_HUB_CACHE)

  • ModelScope
    import os
    print(os.environ.get("MODELSCOPE_CACHE"))  # check the environment variable value
    

It is recommended to prefer the global environment-variable configuration, which avoids repeated code changes.
