Inner Class (to be continued...)

This article takes a detailed look at Java inner classes: how an inner class accesses the members of its outer class, how inner classes are instantiated, and how they can act as closures. It also covers the differences between static nested classes and ordinary inner classes, and explains how the callback mechanism is realized in Java.

Some notes on inner classes:

  • The outer class can access the non-private members of its inner classes
  • An inner class can access all members of its outer class, including private members
  • A (non-static) inner class automatically holds a reference to its enclosing outer-class instance, as the sketch below illustrates
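
A minimal sketch of these three points (hypothetical Outer/Inner names; not from the original article):

    public class Outer {
        private String secret = "outer-private";    // private member of the outer class

        class Inner {
            String label = "inner-label";            // non-private member of the inner class

            void show() {
                // The inner class can read the outer class's private field directly.
                System.out.println(secret);
                // Outer.this is the implicit reference every non-static inner
                // class instance holds to its enclosing Outer instance.
                System.out.println(Outer.this.secret);
            }
        }

        void poke() {
            Inner inner = new Inner();
            System.out.println(inner.label);         // the outer class reads the inner class's member
            inner.show();
        }

        public static void main(String[] args) {
            new Outer().poke();
        }
    }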

Nested Class (static nested class):

     When an inner class is declared static, it is also called a nested class. Because it is static, its communication with the outer class is restricted to some degree. An ordinary inner class may not contain static initializer blocks, static fields (other than compile-time constants), or further static nested classes; a nested class has no such restriction.

  • Creating an instance of a nested class does not require an instance of the outer class
  • A nested class cannot access the non-static members of the outer class (see the sketch below)
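
A minimal sketch of both points (hypothetical Container/Nested names; not from the original article):

    public class Container {
        private int instanceField = 42;          // non-static member
        private static int staticField = 7;      // static member

        static class Nested {
            static final String TAG = "nested";  // nested classes may declare static members

            int readStatic() {
                return staticField;              // OK: static members of the outer class are reachable
            }

            // int readInstance() { return instanceField; }   // would not compile:
            // there is no enclosing Container instance to read it from
        }

        public static void main(String[] args) {
            // No Container instance is needed to create the nested class.
            Container.Nested nested = new Container.Nested();
            System.out.println(nested.readStatic());
        }
    }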

Why Inner Class?

    Inner classes also have the following characteristics:

  1. An inner class can have multiple instances, and each instance carries its own state, independent of the outer-class object.
  2. A single outer class can contain several inner classes, each providing a different implementation of the same abstraction.

    From these two points we can see that inner classes give Java a way to work around the lack of multiple (implementation) inheritance.

  3. Creating an instance of an inner class is not tightly coupled to creating an instance of the outer class.

Therefore, when an inner class is instantiated has little to do with when the outer class is instantiated; an inner class can even be instantiated from outside the outer class, given an existing outer-class instance (see the sketch below).
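
A minimal sketch of both ideas (hypothetical Engine/Radio/HybridCar names; not from the original article): the outer class "inherits" one implementation directly and a second one through its inner class, and the inner class is instantiated from outside with the outer.new Inner() syntax.

    abstract class Engine { abstract void start(); }
    abstract class Radio  { abstract void play();  }

    public class HybridCar extends Engine {
        private final String name = "hybrid";

        @Override
        void start() { System.out.println(name + ": engine started"); }

        // The inner class supplies the second implementation; it can still
        // read HybridCar's private state.
        class CarRadio extends Radio {
            @Override
            void play() { System.out.println(name + ": radio playing"); }
        }

        Radio radio() { return new CarRadio(); }

        public static void main(String[] args) {
            HybridCar car = new HybridCar();
            car.start();
            car.radio().play();

            // Instantiating the inner class from outside, via an existing outer instance:
            HybridCar.CarRadio extra = car.new CarRadio();
            extra.play();
        }
    }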

 

Closure & Callback:

 

    A closure is a callable object that retains information from the scope in which it was created.

          ------ Thinking in Java

 

  In a sense, an inner class is a closure.
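
A minimal sketch of that claim (hypothetical Counter/incrementer names; not from the original article): the object returned by incrementer() is an anonymous inner class, and it keeps access to the scope in which it was created, both the enclosing Counter instance and the local parameter.

    public class Counter {
        private int count = 0;

        interface Step {
            int step();
        }

        Step incrementer(final int delta) {
            return new Step() {
                @Override
                public int step() {
                    count += delta;   // captures the enclosing instance and the local 'delta'
                    return count;
                }
            };
        }

        public static void main(String[] args) {
            Counter counter = new Counter();
            Counter.Step byTwo = counter.incrementer(2);
            System.out.println(byTwo.step());   // 2
            System.out.println(byTwo.step());   // 4
        }
    }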

Callback:

    The idea of a callback comes from C's function pointers. In an ordinary function call you supply the function name and arguments, and the program knows at compile time exactly which function will be invoked; this is a static call. Sometimes, however, a program needs to decide at run time which function to call. Such a call cannot be written out explicitly; instead it is made dynamically and indirectly through a function pointer. A function pointer can be thought of as a variable that holds the address of a function, and C/C++ has its own syntax for declaring one. For example:

      void function(void);           /* function declaration */

      void (*ptr)(void) = function;  /* ptr is a function pointer holding the address of function() */

If another routine is then invoked as call(ptr), call() can use ptr to invoke whatever function it points to.

     Java has no notion of function pointers, so how does it implement callbacks?
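
The article breaks off here, but a minimal sketch of the usual interface-based answer (hypothetical ClickListener/Button/CallbackDemo names; not from the original article): the interface reference plays the role of the function pointer, and an inner class, here an anonymous inner class, supplies the code that gets called back.

    // The interface plays the role of the function pointer's type.
    interface ClickListener {
        void onClick();
    }

    // The "library" side: it only knows the interface, not the concrete class.
    class Button {
        private ClickListener listener;

        void setListener(ClickListener listener) { this.listener = listener; }

        void press() {
            if (listener != null) {
                listener.onClick();   // the callback: invoked through the interface reference
            }
        }
    }

    // The "client" side registers an anonymous inner class as the callback target.
    public class CallbackDemo {
        private String state = "demo state";

        void run() {
            Button button = new Button();
            button.setListener(new ClickListener() {
                @Override
                public void onClick() {
                    // Being an inner class, the callback can use the enclosing object's state.
                    System.out.println("clicked, outer state = " + state);
                }
            });
            button.press();
        }

        public static void main(String[] args) {
            new CallbackDemo().run();
        }
    }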

 
