Uncaught Error: Element type is invalid: expected a string (for built-in components) or a class/function (for composite components) but got: undefined.

The error was thrown by this import:

import { CircularProgress } from '@material-ui/core/CircularProgress'

Change it to:

import CircularProgress from '@material-ui/core/CircularProgress'

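Why the first form fails: the '@material-ui/core/CircularProgress' module exposes the component as its default export, so the curly-brace (named) import resolves to undefined, and rendering <CircularProgress /> then triggers the "Element type is invalid" error above. A minimal sketch of the two working forms, assuming Material-UI v4 ('@material-ui/core'):

// Default import from the component's own path
import CircularProgress from '@material-ui/core/CircularProgress'

// Or a named import from the package root, which re-exports each component
import { CircularProgress } from '@material-ui/core'

// Usage is identical either way
export default function Loading() {
  return <CircularProgress />
}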