【pip】raise MaxRetryError(_pool, url, error or ResponseError(cause))

Error message

raise MaxRetryError(_pool, url, error or ResponseError(cause))
pip._vendor.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='files.pythonhosted.org', port=443): Max retries exceeded with url: /packages/f2/94/3af39d34be01a24a6e65433d19e107099374224905f1e0cc6bbe1fd22a2f/argparse-1.4.0-py2.py3-none-any.whl (Caused by ProtocolError('Connection aborted.', OSError(107, 'Transport endpoint is not connected')))

Cause analysis

The package index is unavailable or the host is unreachable from the current network, so pip's connection retries are exhausted.
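
Before switching to a mirror, it can help to confirm that the default index host really is unreachable from the current machine. A minimal check, assuming curl is available (the host name is taken from the error message above):

curl -I https://files.pythonhosted.org

If this also times out or aborts, the failure is at the network level, and pointing pip at a reachable mirror, as described below, works around it.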

Solution

When installing with pip, add a domestic (China) mirror to the command.

For example, if the original command is:

pip3 install argparse psutil pygresql pyyaml

change it to:

pip3 install argparse psutil pygresql pyyaml -i http://pypi.douban.com/simple/ --trusted-host pypi.douban.com
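
To avoid appending the mirror options to every install command, the same index can also be written into pip's configuration. A minimal sketch, assuming pip 10 or newer so that the pip config subcommand is available (the Douban mirror here is simply the example used above; any reachable domestic index such as https://pypi.tuna.tsinghua.edu.cn/simple can be substituted):

pip3 config set global.index-url http://pypi.douban.com/simple/
pip3 config set global.trusted-host pypi.douban.com

After this, a plain pip3 install argparse psutil pygresql pyyaml will go through the configured mirror.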

Reference: https://www.cnblogs.com/mabingxue/p/8872365.html
