Fixing the Lua error "attempt to index local 'self' (a number value)"

This article looks at the self parameter in Lua: how member functions are called through self, why the error above occurs, and how self behaves inside and outside member functions.


Account = {balance = 0}

-- Constructor: the standard prototype pattern; new objects delegate
-- missing fields to Account through the __index metamethod.
function Account:new (o)
    o = o or {}
    setmetatable(o, self)
    self.__index = self
    return o
end

-- Methods declared with a colon receive an implicit self parameter.
function Account:deposit (v)
    self.balance = self.balance + v
end

function Account:withdraw (v)
    if v > self.balance then error("insufficient funds") end
    self.balance = self.balance - v
end


SpecialAccount = Account:new()
SpecialAccount.deposit(8)

Running this code fails with the error attempt to index local 'self' (a number value). The cause is the following.

In general, you should call member functions with the colon operator (:).

In Lua, the colon : denotes a method call that supplies the receiver as the first parameter, named self.

Thus

A:foo()

is roughly equivalent to

A.foo(A)

If you don't supply A, as in A.foo(), the function body will still try to use the self parameter, which has been filled neither explicitly nor implicitly. Likewise, if you pass something else as the first argument, as in SpecialAccount.deposit(8) above, self is bound to the number 8, so self.balance indexes a number, which is exactly the error reported here.
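As a minimal sketch, using the Account and SpecialAccount definitions from above (the amount 8 is purely illustrative), the call forms compare like this:

SpecialAccount:deposit(8)                  -- colon call: self = SpecialAccount
SpecialAccount.deposit(SpecialAccount, 8)  -- equivalent dot call, self passed explicitly
SpecialAccount.deposit(8)                  -- wrong: self = 8, so self.balance indexes a number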

Note that if you call it from inside a member function, self is already available:

-- inside foo()
-- these two are analogous
self:bar()
self.bar(self)
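For example, a hypothetical transfer method (not part of the original Account code, shown here only as a sketch) can call other members through self without naming the receiver again:

function Account:transfer (target, v)
    self:withdraw(v)      -- same as self.withdraw(self, v)
    target:deposit(v)     -- colon call on another object works the same way
end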

You will find all of this in any good Lua book or tutorial.
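Putting it together, a corrected version of the failing snippet could look like the sketch below (the deposit amount and the printed balance are illustrative assumptions):

SpecialAccount = Account:new()
SpecialAccount:deposit(8)         -- colon call supplies SpecialAccount as self
print(SpecialAccount.balance)     --> 8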


Reposted from: https://my.oschina.net/u/1053706/blog/269759
