How to code a proxy pool

Proxy basics

There are many types of proxy servers: HTTP proxies, FTP proxies, SOCKS proxies, and more. Their characteristics are as follows:

  • HTTP proxy: relays the client's HTTP traffic, mainly browsers fetching web pages; typical ports are 80, 8080 and 3128.
  • FTP proxy: relays FTP client software talking to FTP servers; typical ports are 21 and 2121.
  • RTSP proxy: relays RealPlayer-style clients accessing Real streaming servers; the typical port is 554.
  • POP3 proxy: relays mail clients retrieving mail over POP3; the typical port is 110.
  • SOCKS proxy: unlike the types above, a SOCKS proxy simply forwards packets without caring which application protocol rides on top (HTTP, FTP, ...), which is why SOCKS proxies are generally faster than other proxy types. SOCKS comes in two versions, SOCKS4 and SOCKS5: SOCKS4 supports only TCP (Transmission Control Protocol), while SOCKS5 supports both TCP and UDP (User Datagram Protocol) and adds authentication mechanisms and server-side domain name resolution. Everything SOCKS4 can do, SOCKS5 can do as well, but not the other way around; the QQ chat client, for example, requires a SOCKS5 proxy because it transfers data over UDP. (A client-side example of using an HTTP versus a SOCKS5 proxy follows this list.)
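To make the difference concrete, here is how a client such as Python's requests library can be pointed at an HTTP proxy versus a SOCKS5 proxy. This is only a sketch: the proxy addresses are placeholders, and the SOCKS5 case assumes the optional PySocks extra (`pip install requests[socks]`) is installed.

```python
import requests

# Placeholder proxy addresses -- substitute real proxies from your pool.
HTTP_PROXY   = "http://127.0.0.1:8080"    # typical HTTP proxy port
SOCKS5_PROXY = "socks5://127.0.0.1:1080"  # SOCKS relays raw TCP, protocol-agnostic

# An HTTP(S) request routed through an HTTP proxy.
r = requests.get(
    "https://httpbin.org/ip",
    proxies={"http": HTTP_PROXY, "https": HTTP_PROXY},
    timeout=10,
)
print(r.json())

# The same request routed through a SOCKS5 proxy (needs requests[socks]).
r = requests.get(
    "https://httpbin.org/ip",
    proxies={"http": SOCKS5_PROXY, "https": SOCKS5_PROXY},
    timeout=10,
)
print(r.json())
```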

More about HTTP proxies

  • Transparent proxy: also called an ordinary proxy. It modifies our request and also forwards our real IP address, so the server can recover our IP from headers such as HTTP_X_FORWARDED_FOR (a programmatic check for these anonymity levels is sketched after this list).
  • Anonymous proxy: an ordinary anonymous proxy hides the client's real IP but still modifies the request. It does not forward the real IP, yet it may send headers such as HTTP_VIA and HTTP_PROXY_CONNECTION, from which the server can still tell that a proxy is in use.
  • High-anonymity (elite) proxy: leaves the client's request unchanged, so to the server it looks like a real browser visiting directly; the client's real IP stays hidden and the server cannot tell a proxy is being used.
  • HTTP tunnel: an HTTP proxy server that supports the CONNECT method can tunnel almost any software (QQ, FoxMail, FTP clients, and so on). An HTTP proxy without tunnel support generally handles only plain HTTP requests such as GET and POST.
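One way to check which of these levels a proxy actually provides is to send a request through it to a header-echo endpoint and look at what arrives at the server. The sketch below is illustrative only: it uses httpbin.org as the echo service, a placeholder proxy address, and plain HTTP so the proxy can actually see and modify the headers (the exact header names a given proxy adds may vary).

```python
import requests

def anonymity_level(proxy_url: str) -> str:
    """Rough classification of a proxy, based on the headers the target sees."""
    proxies = {"http": proxy_url, "https": proxy_url}
    # Plain HTTP on purpose: over HTTPS (CONNECT) the proxy cannot alter headers.
    headers = requests.get(
        "http://httpbin.org/headers", proxies=proxies, timeout=10
    ).json()["headers"]

    # Transparent proxies leak the client IP via X-Forwarded-For.
    if "X-Forwarded-For" in headers:
        return "transparent"
    # Ordinary anonymous proxies hide the IP but still reveal themselves.
    if "Via" in headers or "Proxy-Connection" in headers:
        return "anonymous"
    # Nothing proxy-related reached the server: high (elite) anonymity.
    return "elite"

print(anonymity_level("http://127.0.0.1:8080"))  # placeholder proxy
```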

More about SOCKS proxies

  • SOCKS5: the common SOCKS versions are SOCKS4 and SOCKS5, but today SOCKS5 dominates. It supports essentially every client protocol (HTTP, FTP, SMTP, ...) and can provide high-anonymity behaviour.
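Because SOCKS5 can also resolve domain names on the proxy side, it is worth noting that requests (with PySocks installed) distinguishes the two modes by URL scheme: `socks5://` resolves hostnames locally, while `socks5h://` lets the proxy resolve them. A minimal sketch with a placeholder proxy address:

```python
import requests

proxy = "socks5h://127.0.0.1:1080"  # 'h' = let the proxy resolve DNS names

r = requests.get(
    "https://httpbin.org/ip",
    proxies={"http": proxy, "https": proxy},
    timeout=10,
)
print(r.json())  # should show the proxy's exit IP, not ours
```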

Use cases

A proxy pool comes in handy in situations such as when your IP has been banned by the target site, or when the number of requests allowed per IP is limited.

Crawling strategies

  • Distributed crawling
  • An IP proxy pool
  • A delay between requests (combined with proxy rotation in the sketch below)
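The last two points combine naturally: pick a proxy from the pool for each request and add a jittered delay between requests. A minimal sketch, with a placeholder proxy list and target URL:

```python
import random
import time
import requests

# Placeholder pool -- in practice this is fed by the proxy fetcher described later.
PROXY_POOL = ["http://10.0.0.1:8080", "http://10.0.0.2:3128"]

def fetch(url: str):
    proxy = random.choice(PROXY_POOL)               # rotate IPs across requests
    try:
        return requests.get(
            url,
            proxies={"http": proxy, "https": proxy},
            headers={"User-Agent": "Mozilla/5.0"},  # avoid the default python-requests UA
            timeout=10,
        )
    except requests.RequestException:
        return None
    finally:
        time.sleep(random.uniform(1, 3))            # polite, jittered delay

resp = fetch("https://example.com/")
print(resp.status_code if resp else "request failed")
```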

How sites detect crawlers

  • High-concurrency crawlers can be spotted manually, or the server can cap the number of connections per IP (a toy rate limiter is sketched after this list).
  • Identify crawlers by their User-Agent. This mainly catches crawlers with a fairly low number of concurrent connections.
  • Identify crawlers through the site's traffic analytics and log analysis. Some crawlers forge a real browser's User-Agent so they cannot be identified that way; in that case the real visitor IPs recorded by the traffic analytics system can be used to tell them apart.
  • Implement a real-time anti-crawler firewall on the site.
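For the first point, a server-side per-IP limit can be approximated with a sliding-window counter. This is only a toy sketch; the window length and threshold are arbitrary values chosen for illustration, not recommendations.

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # length of the sliding window (arbitrary)
MAX_REQUESTS = 100    # allowed requests per IP per window (arbitrary)

_hits: dict[str, deque] = defaultdict(deque)

def allow_request(client_ip: str) -> bool:
    """Return False once an IP exceeds MAX_REQUESTS within the window."""
    now = time.monotonic()
    hits = _hits[client_ip]
    # Drop timestamps that have fallen outside the window.
    while hits and now - hits[0] > WINDOW_SECONDS:
        hits.popleft()
    if len(hits) >= MAX_REQUESTS:
        return False
    hits.append(now)
    return True

print(allow_request("203.0.113.7"))  # True until the IP hits the limit
```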

Overall approach

Periodically crawl proxy listing pages, parse them, and extract candidate proxies.
Periodically refresh the proxy sources and, for each target site you intend to crawl, test every proxy IP for speed, timeouts and other performance metrics, so that you end up with the best IPs for that particular target.
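A minimal sketch of that loop, with placeholder source and target URLs and a naive regex instead of a real HTML parser: scrape a listing page, extract ip:port candidates, and time each one against the target site.

```python
import re
import time
import requests

PROXY_SOURCE = "https://example.com/free-proxy-list"  # placeholder listing page
TARGET_URL   = "https://example.com/"                  # site we actually want to crawl

def scrape_candidates() -> list[str]:
    html = requests.get(PROXY_SOURCE, timeout=10).text
    # Naive "ip:port" pattern; real listing pages usually need an HTML parser.
    return re.findall(r"\d{1,3}(?:\.\d{1,3}){3}:\d{2,5}", html)

def measure(proxy: str, timeout: float = 5.0):
    """Return the response time in seconds, or None if the proxy fails."""
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    start = time.monotonic()
    try:
        requests.get(TARGET_URL, proxies=proxies, timeout=timeout)
        return time.monotonic() - start
    except requests.RequestException:
        return None

if __name__ == "__main__":
    results = {p: measure(p) for p in scrape_candidates()}
    usable = sorted((t, p) for p, t in results.items() if t is not None)
    print(usable[:10])  # the ten fastest proxies for this target
```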

Key challenge

The proxies have to be evaluated against several given target sites at once, i.e. speed and timeout behaviour must be measured per target site.

Current state of proxy pools

Typical public proxy pools run only a single generic test per proxy. Improvement: build a pool of test URLs, one per target site, and score every proxy against all of them.
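Sketched below is what such a per-target test could look like: one test URL per target site and one latency score per (proxy, target) pair. All URLs and the example proxy are placeholders.

```python
import time
import requests

# One representative test URL per target site (placeholders).
TEST_URL_POOL = {
    "site_a": "https://a.example.com/robots.txt",
    "site_b": "https://b.example.com/robots.txt",
}

def score_proxy(proxy: str, timeout: float = 5.0) -> dict:
    """Latency of one proxy against every target; None marks a timeout/failure."""
    proxies = {"http": f"http://{proxy}", "https": f"http://{proxy}"}
    scores = {}
    for site, url in TEST_URL_POOL.items():
        start = time.monotonic()
        try:
            requests.get(url, proxies=proxies, timeout=timeout)
            scores[site] = time.monotonic() - start
        except requests.RequestException:
            scores[site] = None
    return scores

# Pick the best proxy *per target site*, not one global winner.
print(score_proxy("127.0.0.1:8080"))
```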

Building the proxy pool's web pages

Pages that expose the proxy data
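What these pages look like depends on the web framework; as one illustration, a tiny Flask app could expose the validated proxies as JSON. The route names and the in-memory store here are invented for the sketch and would be backed by Redis or a database in practice.

```python
import random
from flask import Flask, jsonify

app = Flask(__name__)

# Placeholder store -- a real pool would keep this in Redis or a database.
VALIDATED_PROXIES = [
    {"proxy": "10.0.0.1:8080", "target": "site_a", "latency": 0.42},
]

@app.route("/proxies")
def list_proxies():
    """Return every validated proxy together with its per-target score."""
    return jsonify(VALIDATED_PROXIES)

@app.route("/proxies/random")
def random_proxy():
    """Hand out a single proxy for a crawler to use."""
    return jsonify(random.choice(VALIDATED_PROXIES))

if __name__ == "__main__":
    app.run(port=5000)
```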

08-01
C:\Users\20685\AppData\Local\Programs\Python\Python310\python.exe D:\pythonProject_request\run.py ============================= test session starts ============================= platform win32 -- Python 3.10.11, pytest-8.3.5, pluggy-1.5.0 -- C:\Users\20685\AppData\Local\Programs\Python\Python310\python.exe cachedir: .pytest_cache metadata: {'Python': '3.10.11', 'Platform': 'Windows-10-10.0.26100-SP0', 'Packages': {'pytest': '8.3.5', 'pluggy': '1.5.0'}, 'Plugins': {'allure-pytest': '2.14.0', 'base-url': '2.1.0', 'html': '4.1.1', 'metadata': '3.1.1', 'order': '1.3.0', 'ordering': '0.6', 'rerunfailures': '15.1', 'xdist': '3.8.0'}, 'JAVA_HOME': 'C:\\Program Files\\Java\\jdk1.8.0_131', 'Base URL': ''} rootdir: D:\pythonProject_request configfile: pytest.ini plugins: allure-pytest-2.14.0, base-url-2.1.0, html-4.1.1, metadata-3.1.1, order-1.3.0, ordering-0.6, rerunfailures-15.1, xdist-3.8.0 collecting ... collected 1 item testcaes/test_all_case.py::TestAllCase::test_login[caseinfo0] FAILED ================================== FAILURES =================================== ______________________ TestAllCase.test_login[caseinfo0] ______________________ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x000002DA1B54E380> conn = <urllib3.connection.HTTPSConnection object at 0x000002DA1B54E980> method = 'POST' url = '/adminapi/login?Content_Type=application%2Fx-www-from-urlencoded' body = 'account=admin&pwd=123456&imgcode=' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '33', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = <urllib3.connection.HTTPSConnection object at 0x000002DA1B54E980> preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. 
Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: > self._validate_conn(conn) C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py:468: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py:1097: in _validate_conn conn.connect() C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connection.py:642: in connect sock_and_verified = _ssl_wrap_socket_and_match_hostname( C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connection.py:783: in _ssl_wrap_socket_and_match_hostname ssl_sock = ssl_wrap_socket( C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\ssl_.py:471: in ssl_wrap_socket ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname) C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\ssl_.py:515: in _ssl_wrap_socket_impl return ssl_context.wrap_socket(sock, server_hostname=server_hostname) C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\ssl.py:513: in wrap_socket return self.sslsocket_class._create( C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\ssl.py:1071: in _create self.do_handshake() _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <ssl.SSLSocket [closed] fd=-1, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=0> block = False @_sslcopydoc def do_handshake(self, block=False): self._check_connected() timeout = self.gettimeout() try: if timeout == 0.0 and block: self.settimeout(None) > self._sslobj.do_handshake() E ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007) C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\ssl.py:1342: SSLCertVerificationError During handling of the above exception, another exception occurred: self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x000002DA1B54E380> method = 'POST' url = '/adminapi/login?Content_Type=application%2Fx-www-from-urlencoded' body = 'account=admin&pwd=123456&imgcode=' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': 
'*/*', 'Connection': 'keep-alive', 'Content-Length': '33', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None, preload_content = False decode_content = False, response_kw = {} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/adminapi/login', query='Content_Type=application%2Fx-www-from-urlencoded', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( # type: ignore[override] self, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | bool | int | None = None, redirect: bool = True, assert_same_host: bool = True, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, pool_timeout: int | None = None, release_conn: bool | None = None, chunked: bool = False, body_pos: _TYPE_BODY_POSITION | None = None, preload_content: bool = True, decode_content: bool = True, **response_kw: typing.Any, ) -> BaseHTTPResponse: """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. 
:param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param bool preload_content: If True, the response's body will be preloaded into memory. :param bool decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``preload_content`` which defaults to ``True``. :param bool chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = preload_content # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = to_str(_encode_target(url)) else: url = to_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] <https://github.com/urllib3/urllib3/issues/651> release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() # type: ignore[attr-defined] headers.update(self.proxy_headers) # type: ignore[union-attr] # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout # type: ignore[assignment] # Is this a closed/new connection that requires CONNECT tunnelling? 
if self.proxy is not None and http_tunnel_required and conn.is_closed: try: self._prepare_proxy(conn) except (BaseSSLError, OSError, SocketTimeout) as e: self._raise_timeout( err=e, url=self.proxy.url, timeout_value=conn.timeout ) raise # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Make the request on the HTTPConnection object > response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, retries=retries, response_conn=response_conn, preload_content=preload_content, decode_content=decode_content, **response_kw, ) C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py:791: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x000002DA1B54E380> conn = <urllib3.connection.HTTPSConnection object at 0x000002DA1B54E980> method = 'POST' url = '/adminapi/login?Content_Type=application%2Fx-www-from-urlencoded' body = 'account=admin&pwd=123456&imgcode=' headers = {'User-Agent': 'python-requests/2.32.3', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '33', 'Content-Type': 'application/x-www-form-urlencoded'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) timeout = Timeout(connect=None, read=None, total=None), chunked = False response_conn = <urllib3.connection.HTTPSConnection object at 0x000002DA1B54E980> preload_content = False, decode_content = False, enforce_content_length = True def _make_request( self, conn: BaseHTTPConnection, method: str, url: str, body: _TYPE_BODY | None = None, headers: typing.Mapping[str, str] | None = None, retries: Retry | None = None, timeout: _TYPE_TIMEOUT = _DEFAULT_TIMEOUT, chunked: bool = False, response_conn: BaseHTTPConnection | None = None, preload_content: bool = True, decode_content: bool = True, enforce_content_length: bool = True, ) -> BaseHTTPResponse: """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param timeout: If specified, overrides the default timeout for this one request. 
It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param response_conn: Set this to ``None`` if you will handle releasing the connection or set the connection to have the response release it. :param preload_content: If True, the response's body will be preloaded during construction. :param decode_content: If True, will attempt to decode the body based on the 'content-encoding' header. :param enforce_content_length: Enforce content length checking. Body returned by server must match value of Content-Length header, if present. Otherwise, raise error. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = Timeout.resolve_default_timeout(timeout_obj.connect_timeout) try: # Trigger any extra validation we need to do. try: self._validate_conn(conn) except (SocketTimeout, BaseSSLError) as e: self._raise_timeout(err=e, url=url, timeout_value=conn.timeout) raise # _validate_conn() starts the connection to an HTTPS proxy # so we need to wrap errors with 'ProxyError' here too. except ( OSError, NewConnectionError, TimeoutError, BaseSSLError, CertificateError, SSLError, ) as e: new_e: Exception = e if isinstance(e, (BaseSSLError, CertificateError)): new_e = SSLError(e) # If the connection didn't successfully connect to it's proxy # then there if isinstance( new_e, (OSError, NewConnectionError, TimeoutError, SSLError) ) and (conn and conn.proxy and not conn.has_connected_to_proxy): new_e = _wrap_proxy_error(new_e, conn.proxy.scheme) > raise new_e E urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007) C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\connectionpool.py:492: SSLError The above exception was the direct cause of the following exception: self = <requests.adapters.HTTPAdapter object at 0x000002DA1B51A710> request = <PreparedRequest [POST]>, stream = False timeout = Timeout(connect=None, read=None, total=None), verify = True cert = None, proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest <PreparedRequest>` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) <timeouts>` tuple. :type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. 
[... source context for requests.adapters.HTTPAdapter.send, urllib3.connectionpool.HTTPSConnectionPool.urlopen and urllib3.util.retry.Retry.increment omitted; the retries are already exhausted (total=0), so the underlying SSL error is re-raised ...]

E   urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='192.168.116.137', port=443): Max retries exceeded with url: /adminapi/login?Content_Type=application%2Fx-www-from-urlencoded (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)')))

C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\urllib3\util\retry.py:515: MaxRetryError

During handling of the above exception, another exception occurred:

self = <testcaes.test_all_case.TestAllCase object at 0x000002DA1B54E1D0>
caseinfo = {'request': {'data': {'account': 'admin', 'imgcode': '', 'pwd': 123456}, 'method': 'post', 'params': {'Content_Type': 'application/x-www-from-urlencoded'}, 'url': 'https://192.168.116.137/adminapi/login'}, 'validate': None, 'verify': False}

    @pytest.mark.parametrize("caseinfo", read_yaml(yaml_path))
    def duyo(self, caseinfo):
        new_caseinfo = verify_yaml(caseinfo)  # after reading the yaml, validate the case against the case template
        # send the request
>       RequestUtil().send_all_request(**new_caseinfo.request)

testcaes\test_all_case.py:17:
common\requests_util.py:27: in send_all_request
    res = RequestUtil.sess.request(**kwargs)
C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py:589: in request
    resp = self.send(prep, **send_kwargs)
C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\sessions.py:703: in send
    r = adapter.send(request, **kwargs)

[... requests.adapters.HTTPAdapter.send (called with verify = True, proxies = OrderedDict()) repeats the urlopen call, catches the MaxRetryError and converts it ...]

E   requests.exceptions.SSLError: HTTPSConnectionPool(host='192.168.116.137', port=443): Max retries exceeded with url: /adminapi/login?Content_Type=application%2Fx-www-from-urlencoded (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1007)')))

C:\Users\20685\AppData\Local\Programs\Python\Python310\lib\site-packages\requests\adapters.py:698: SSLError

============================== warnings summary ===============================
PytestAssertRewriteWarning: Module already imported so cannot be rewritten: allure_pytest
PytestConfigWarning: No files were found in testpaths; consider removing or adjusting your testpaths configuration. Searching recursively from the current directory instead.
PytestConfigWarning: Unknown config option: Python_classes
-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html

=========================== short test summary info ===========================
FAILED testcaes/test_all_case.py::TestAllCase::test_login[caseinfo0] - requests.exceptions.SSLError
======================== 1 failed, 3 warnings in 0.25s ========================

Process finished with exit code 0
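The failure itself is straightforward: the trace shows adapter.send running with verify = True even though the yaml case carries 'verify': False, so requests tries to validate the self-signed certificate of 192.168.116.137 against the local CA store and fails. Below is a minimal sketch (not the project's RequestUtil code) of the two usual ways to make such a request succeed with requests: skip verification, or point `verify` at a trusted CA bundle; the bundle path is only a placeholder.

```python
# Minimal sketch (assumptions: the host, endpoint and payload are taken from
# the trace above; the CA bundle path is a placeholder).
import requests
import urllib3

url = "https://192.168.116.137/adminapi/login"
payload = {"account": "admin", "pwd": "123456", "imgcode": ""}

# Option 1: skip certificate verification (acceptable only for internal test
# environments) and silence the InsecureRequestWarning it triggers.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
resp = requests.post(url, data=payload, verify=False)
print(resp.status_code)

# Option 2: keep verification but trust the server's own CA by pointing
# `verify` at a PEM bundle (placeholder path).
# resp = requests.post(url, data=payload, verify="certs/internal-ca.pem")
```

In the test framework the equivalent fix is presumably to forward the case's verify flag into `session.request(..., verify=...)` instead of leaving the default, so that the 'verify': False already present in the yaml actually takes effect.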
### Response proxies in web development

#### What a response proxy is

A response proxy is middleware that sits between the client and the server during network communication. It intercepts and processes both the requests coming from the client and the responses returned to it, which enables features such as security inspection and traffic analysis.

#### How it works

The mechanism can be understood from a few angles:

- **HTTP protocol parsing**: when a browser visits a target site, what is actually sent are packets that follow the HTTP/HTTPS standard. These packets carry the request headers, the method (GET, POST, etc.) and other parameters, while the response consists of a status code, a message body and so on[^2].
- **Interception and modification**: tools such as ZAP or Burp Suite can be configured to listen on a specific port as a proxy service. Any traffic passing through that port can be captured for further handling, for example inspecting the body of a POST submission or tampering with a Cookie field.
- **Security considerations**: because sensitive data may be transmitted, such tools usually offer encrypted connections to protect privacy, along with strict access controls to prevent unauthorized use.

#### Example use case

Suppose a developer wants to test whether an API endpoint is vulnerable to SQL injection. They can point the environment (or the client's proxy settings) at a locally running proxy and then exercise the endpoint; every interaction is shown transparently, making it easy to spot anomalies.

The snippet below illustrates how a simple HTTP proxy server can be built with Node.js and the `http-proxy` package (the forwarding target is only a placeholder):

```javascript
// Minimal proxy built on the http-proxy package: requests arriving on
// port 8080 are forwarded to the target below.
const httpProxy = require('http-proxy');

const proxyServer = httpProxy.createProxyServer({
  target: 'http://localhost:3000', // placeholder backend to forward to
});

// Report proxying failures back to the client instead of crashing.
proxyServer.on('error', (err, req, res) => {
  console.error(`Error occurred while trying to proxy request: ${err.message}`);
  res.writeHead(500, { 'Content-Type': 'text/plain' });
  res.end('Something went wrong.');
});

proxyServer.listen(8080);
console.log('Listening on port 8080...');
```
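On the client side, reproducing the interception workflow from Python only requires pointing the HTTP library at the proxy's listening address. The sketch below is a minimal example, assuming ZAP or Burp Suite is listening locally on port 8080 (their common default) and using a placeholder target URL.

```python
# Minimal sketch: route a requests call through a local intercepting proxy
# (assumption: ZAP/Burp listening on 127.0.0.1:8080; target URL is a placeholder).
import requests

proxies = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",
}

# verify=False because an intercepting proxy re-signs HTTPS traffic with its
# own CA; in real use you would instead trust the proxy's exported CA bundle.
resp = requests.get(
    "https://example.com/api/items?id=1",
    proxies=proxies,
    verify=False,
)
print(resp.status_code, len(resp.content))
```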