3. CREATING OBJECTS, GUIDES & LAYERS

This section collects a handful of practical shortcuts for working in graphic design software, covering showing rulers, creating and managing guides, and shape-drawing tricks, to help designers work more efficiently.

1. View > Rulers > Show Rulers (Ctrl+R) displays the rulers in the workspace. Drag from a ruler onto the artboard to create a guide. View > Guides > Lock Guides locks the guides you have created.

 

2. Hold Shift while dragging a guide out of the ruler and the guide snaps to the nearest ruler tick mark.

 

3. Select a guide and press Backspace to delete it.

 

4. Right-click a ruler to change its unit of measurement; this also changes the units displayed throughout the workspace.

 

5. With a guide selected, press Enter to set its position numerically.

 

6. Edit > Preferences lets you adjust the settings for guides and the grid.

 

7. When drawing an ellipse, hold Shift while dragging to get a perfect circle; add Alt to draw it outward from the center.

 

8. When drawing a star, hold Ctrl to make the star's points thinner, and press the Up/Down arrow keys to increase or decrease the number of points.

 

9. New shapes are placed on the current layer. When you select a shape in the workspace, a colored dot appears at the right edge of that shape's layer in the Layers panel; drag the dot onto another layer to move the shape to that layer.

 

10. Object > Path > Split Into Grid splits the selected object into a grid of rows and columns.

Parameter base_model.model.model.layers.21.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.21.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.21.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.21.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.21.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.21.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.21.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.22.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.22.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.22.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.22.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.22.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.22.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.v_proj.lora_A.default.weight is still on meta device! 
Parameter base_model.model.model.layers.23.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.23.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.23.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.23.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.23.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.23.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.23.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.24.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.24.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.24.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.24.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.24.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.24.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.25.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.25.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.25.mlp.up_proj.weight is still on meta device! 
Parameter base_model.model.model.layers.25.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.25.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.25.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.26.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.26.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.26.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.26.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.26.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.26.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.27.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.27.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.27.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.27.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.27.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.27.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.q_proj.lora_B.default.weight is still on meta device! 
Parameter base_model.model.model.layers.28.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.28.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.28.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.28.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.28.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.28.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.28.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.29.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.29.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.29.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.29.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.29.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.29.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.30.self_attn.q_norm.weight is still on meta device! 
Parameter base_model.model.model.layers.30.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.30.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.30.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.30.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.30.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.30.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.31.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.31.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.31.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.31.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.31.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.31.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.32.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.32.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.32.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.32.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.32.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.32.post_attention_layernorm.weight is still on meta device! 
Parameter base_model.model.model.layers.33.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.33.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.33.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.33.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.33.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.33.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.33.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.v_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.34.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.34.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.34.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.34.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.34.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.34.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.q_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.q_proj.lora_A.default.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.q_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.k_proj.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.v_proj.base_layer.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.v_proj.lora_A.default.weight is still on meta device! 
Parameter base_model.model.model.layers.35.self_attn.v_proj.lora_B.default.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.o_proj.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.q_norm.weight is still on meta device! Parameter base_model.model.model.layers.35.self_attn.k_norm.weight is still on meta device! Parameter base_model.model.model.layers.35.mlp.gate_proj.weight is still on meta device! Parameter base_model.model.model.layers.35.mlp.up_proj.weight is still on meta device! Parameter base_model.model.model.layers.35.mlp.down_proj.weight is still on meta device! Parameter base_model.model.model.layers.35.input_layernorm.weight is still on meta device! Parameter base_model.model.model.layers.35.post_attention_layernorm.weight is still on meta device! Parameter base_model.model.model.norm.weight is still on meta device! Parameter base_model.model.lm_head.weight is still on meta device!
The output you are seeing:

```
Parameter base_model.model.model.layers.0.self_attn.q_proj.base_layer.weight is still on meta device!
...
Parameter base_model.model.lm_head.weight is still on meta device!
```

means that **the model's structure has been built, but none of its parameters have actually been loaded into memory or onto the GPU - they are all still on the `meta` device.** This is a serious problem: the model cannot run inference or training in this state.

---

## 🔍 Root cause

The parameter names (e.g. `.lora_A.default.weight`) show that this is a **LoRA-fine-tuned Qwen3-8B model** loaded through `modelscope`.

The likely mistake made at load time:

> ❌ The base model was loaded first and the LoRA adapter was then applied without properly restoring the PEFT structure.

Or, more commonly:

> ✅ What you downloaded is a **PEFT-format fine-tuned model (i.e. a LoRA adapter)**, but you loaded it directly with `AutoModelForCausalLM.from_pretrained()`. That causes:
>
> - the model structure to be recognized,
> - but the weights to be created lazily on the `meta` device,
> - and the real weights are never loaded → every parameter stays on `meta`.

---

## ✅ The fix: use `PeftModel.from_pretrained`

You need to tell the framework explicitly that this is a **PEFT model (LoRA adapter)**, not a complete CausalLM checkpoint.

### ✅ Correct code:

```python
from modelscope import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel, PeftConfig
import torch

# 1. Path to your LoRA adapter (the fine-tuning output directory)
peft_model_path = "./your_lora_adapter_path"  # replace with your actual path

# 2. Read the PEFT config; it records which base model the adapter belongs to
#    (if you are unsure of the base model, check adapter_config.json)
config = PeftConfig.from_pretrained(peft_model_path)

# 3. Load the base model first (e.g. qwen/Qwen3-8B)
base_model = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path,  # base model name taken from the adapter config
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True
)
tokenizer = AutoTokenizer.from_pretrained(
    config.base_model_name_or_path,
    trust_remote_code=True
)

# 4. Attach the LoRA adapter to the base model
model = PeftModel.from_pretrained(base_model, peft_model_path)

# 5. Optional: merge the LoRA weights into the base model (faster inference)
model = model.merge_and_unload()

# 6. Smoke test
input_text = "Hello, please introduce yourself."
inputs = tokenizer(input_text, return_tensors="pt").to("cuda")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

---

## 📁 Required file layout

Make sure your fine-tuning output directory contains the following files:

```
your_lora_adapter_path/
├── adapter_config.json        ← must exist
├── adapter_model.bin          ← or .safetensors
├── README.md                  (optional)
└── special_tokens_map.json    (optional)
```

`adapter_config.json` defines, among other things:

```json
{
  "base_model_name_or_path": "Qwen/Qwen3-8B",
  "peft_type": "LORA",
  ...
}
```

This field determines which base model you should load.

---

## ⚠️ Common mistakes

| Mistake | Consequence |
|---------|-------------|
| `AutoModelForCausalLM.from_pretrained("./lora_adapter")` | every parameter stays on the `meta` device |
| Forgetting `trust_remote_code=True` | error: model class not found |
| Wrong base model path | weight shape mismatch |

---

## ✅ How to verify the load succeeded

Run this snippet to check whether any parameter is still on `meta`:

```python
for name, param in model.named_parameters():
    if param.device == torch.device("meta"):
        print(f"[ERROR] {name} is on meta device!")
    else:
        print(f"[OK] {name} loaded on {param.device}")
```

✅ After a successful load, nothing should be reported as being on `meta`.

---

## 💡 Tips: saving time and memory

### Option 1: quantized loading (if you are short on VRAM)

```python
base_model = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,
    load_in_4bit=True,  # or load_in_8bit=True
)
```

Requires:

```bash
pip install bitsandbytes accelerate
```

### Option 2: skip the merge for inference

```python
# Do not merge; keep the LoRA structure
output = model.generate(**inputs, max_new_tokens=100)
```

Pros: saves memory. Cons: slightly slower.

---

## ✅ Summary: why "all parameters on meta"?

| Cause | Fix |
|-------|-----|
| Loading a PEFT adapter with `AutoModelForCausalLM` | use `PeftModel.from_pretrained` instead |
| Missing `adapter_config.json` | make sure the adapter files are complete |
| Wrong base model path | check `config.base_model_name_or_path` |
| `trust_remote_code` not enabled | add the flag |
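As a quick pre-flight check before loading anything, you can read `adapter_config.json` yourself and confirm which base model the adapter expects. The following is a minimal sketch (not part of the original answer), assuming the adapter lives in the hypothetical `./your_lora_adapter_path` directory used above; it only uses the Python standard library:

```python
import json
from pathlib import Path

adapter_dir = Path("./your_lora_adapter_path")  # hypothetical adapter directory

# adapter_config.json is written by PEFT when the adapter is saved
cfg_path = adapter_dir / "adapter_config.json"
if not cfg_path.exists():
    raise FileNotFoundError(
        f"{cfg_path} not found - this directory is not a PEFT adapter, "
        "or the fine-tuning run did not finish saving."
    )

cfg = json.loads(cfg_path.read_text())
print("peft_type               :", cfg.get("peft_type"))
print("base_model_name_or_path :", cfg.get("base_model_name_or_path"))

# The adapter weights themselves are small; the base model is downloaded separately
weights = [p.name for p in adapter_dir.glob("adapter_model.*")]
print("adapter weight files    :", weights or "none found")
```

If `base_model_name_or_path` points at a model you do not have locally, fetch that base model first; loading the adapter directory directly with `AutoModelForCausalLM` is exactly the scenario that leaves every parameter on the `meta` device.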