Error when loading a pretrained BERT model: huggingface_hub.utils._validators.HFValidationError

The full error message is:

huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': './bert/bert_base_cased_ICEWS14'. Use `repo_type` argument if needed.
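For context, the error comes from a call along these lines (a sketch; the path is taken from the error message, and the exact call in the project may differ):

```python
from transformers import BertTokenizerFast

# If "./bert/bert_base_cased_ICEWS14" does not exist or contains no model files,
# transformers does not treat the string as a local directory; it falls back to
# interpreting it as a Hub repo id, which huggingface_hub rejects as invalid,
# producing the HFValidationError above.
tokenizer = BertTokenizerFast.from_pretrained("./bert/bert_base_cased_ICEWS14")
```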

The cause is simple: the code I downloaded did not include the model weight files. Download them from https://huggingface.co/bert-base-cased/tree/main and put them into that folder, and the error goes away.
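If you would rather fetch the files programmatically than through the browser, something like the following also works (a minimal sketch using huggingface_hub's snapshot_download; the target directory simply mirrors the path from the error message):

```python
from huggingface_hub import snapshot_download

# Download bert-base-cased (config.json, vocab.txt, the tokenizer files and the
# weight file) into the local directory the project expects.
snapshot_download(
    repo_id="bert-base-cased",
    local_dir="./bert/bert_base_cased_ICEWS14",
)
```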

Note that the path passed to BertTokenizerFast.from_pretrained must be the directory that contains the model files, not one of the files itself.
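For example (a sketch, assuming the downloaded files were placed in ./bert/bert_base_cased_ICEWS14):

```python
from transformers import BertTokenizerFast

# Correct: pass the directory containing config.json, vocab.txt and the weights.
tokenizer = BertTokenizerFast.from_pretrained("./bert/bert_base_cased_ICEWS14")

# Wrong: pointing at a single file inside that directory will not work.
# tokenizer = BertTokenizerFast.from_pretrained("./bert/bert_base_cased_ICEWS14/vocab.txt")
```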
