Pitfalls When Installing llama-cpp-python

1. Installing llama-cpp-python fails with an error

$ CMAKE_ARGS="-DLLAMA_CUBLAS=on" pip install llama-cpp-python -i https://pypi.tuna.tsinghua.edu.cn/simple
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple
Collecting llama-cpp-python
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/de/6d/4a20e676bdf7d9d3523be3a081bf327af958f9bdfe2a564f5cf485faeaec/llama_cpp_python-0.3.9.tar.gz (67.9 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 67.9/67.9 MB 5.3 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Installing backend dependencies ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: typing-extensions>=4.5.0 in /home/wuwenliang/anaconda3/envs/llmtuner/lib/python3.10/site-packages (from llama-cpp-python) (4.13.2)
Requirement already satisfied: numpy>=1.20.0 in /home/wuwenliang/anaconda3/envs/llmtuner/lib/python3.10/site-packages (from llama-cpp-python) (1.26.4)
Collecting diskcache>=5.6.1 (from llama-cpp-python)
  Downloading https://pypi.tuna.tsinghua.edu.cn/packages/3f/27/4570e78fc0bf5ea0ca45eb1de3818a23787af9b390c0b0a0033a1b8236f9/diskcache-5.6.3-py3-none-any.whl (45 kB)
Requirement already satisfied: jinja2>=2.11.3 in /home/wuwenliang/anaconda3/envs/llmtuner/lib/python3.10/site-packages (from llama-cpp-python) (3.1.6)
Requirement already satisfied: MarkupSafe>=2.0 in /home/wuwenliang/anaconda3/envs/llmtuner/lib/python3.10/site-packages (from jinja2>=2.11.3->llama-cpp-python) (3.0.2)
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error
  
  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [29 lines of output]
      *** scikit-build-core 0.11.5 using CMake 3.22.1 (wheel)
      *** Configuring CMake...
      loading initial cache file /tmp/tmp01d6kko6/build/CMakeInit.txt
      -- The C compiler identification is GNU 11.2.0
      -- The CXX compiler identification is GNU 11.2.0
      -- Detecting C compiler ABI info
      -- Detecting C compiler ABI info - done
      -- Check for working C compiler: /usr/bin/gcc - skipped
      -- Detecting C compile features
      -- Detecting C compile features - done
      -- Detecting CXX compiler ABI info
      -- Detecting CXX compiler ABI info - done
      -- Check for working CXX compiler: /usr/bin/g++ - skipped
      -- Detecting CXX compile features
      -- Detecting CXX compile features - done
      -- Found Git: /usr/bin/git (found version "2.34.1")
      CMake Error at vendor/llama.cpp/CMakeLists.txt:108 (message):
        LLAMA_CUBLAS is deprecated and will be removed in the future.
      
        Use GGML_CUDA instead
      
      Call Stack (most recent call first):
        vendor/llama.cpp/CMakeLists.txt:113 (llama_option_depr)
      
      
      -- Configuring incomplete, errors occurred!
      See also "/tmp/tmp01d6kko6/build/CMakeFiles/CMakeOutput.log".
      
      *** CMake configuration failed
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Failed to build installable wheels for some pyproject.toml based projects (llama-cpp-python)

The installation fails. Analysis:

The error occurs because the LLAMA_CUBLAS option has been deprecated and GGML_CUDA should be used instead, so the CMake argument in the install command needs to be changed.

Full install command (including additional options that may be needed):

CMAKE_ARGS="-DGGML_CUDA=on -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc" pip install llama-cpp-python
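
Before rerunning the install, it is also worth confirming that the CUDA toolkit is visible to the build environment. The check below is a minimal sketch that assumes the toolkit is installed under /usr/local/cuda; adjust the paths if your setup differs.

# Confirm nvcc is on PATH and report its version
which nvcc
nvcc --version

# If nvcc is not found, expose the usual CUDA locations first
export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH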

The installation still failed:

  collect2: error: ld returned 1 exit status
  ninja: build stopped: subcommand failed.

  *** CMake build failed
  error: subprocess-exited-with-error
  
  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> See above for output.
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  full command: /llmtuner/bin/python /llmtuner/li
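
The log above is truncated, so the actual linker error is not visible. One way to dig further (a diagnostic sketch, not a verified fix) is to rerun the build with pip's verbose output so the full compiler and linker command lines are printed:

# Rebuild with verbose output; --no-cache-dir forces a fresh source build
CMAKE_ARGS="-DGGML_CUDA=on -DCMAKE_CUDA_COMPILER=/usr/local/cuda/bin/nvcc" \
  pip install llama-cpp-python --no-cache-dir --verbose

With the full linker line visible, common culprits for "collect2: error: ld returned 1 exit status" in a GGML_CUDA build include CUDA libraries missing from the linker search path, a gcc/g++ version the installed CUDA toolkit does not support, or a CUDA architecture setting that does not match the GPU; which of these applies here cannot be determined from the truncated output.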