Large language model (LLM) Fenghuang: training setup fails while installing flash_attn ("No module named 'torch'")

While installing flash_attn, a library commonly used with large language models (LLMs), the build failed with ModuleNotFoundError: No module named 'torch'. Installing PyTorch via conda resolved the error, after which flash_attn 1.0.5 installed successfully. Note that CUDA 11.6 is expected here; other versions may cause build errors.

Installing flash_attn fails with the following error:

pip install flash_attn
Collecting flash_attn
  Using cached flash_attn-1.0.8.tar.gz (2.0 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [15 lines of output]
      Traceback (most recent call last):
        File "/home/aaa/anaconda3/envs/fenghuang/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/home/aaa/anaconda3/envs/fenghuang/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/aaa/anaconda3/envs/fenghuang/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-tamzhq3w/overlay/lib/python3.9/site-packages/setuptools/build_meta.py", line 341, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/tmp/pip-build-env-tamzhq3w/overlay/lib/python3.9/site-packages/setuptools/build_meta.py", line 323, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-tamzhq3w/overlay/lib/python3.9/site-packages/setuptools/build_meta.py", line 338, in run_setup
          exec(code, locals())
        File "<string>", line 13, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
 

After trying several approaches without success, the following steps resolved the problem:

1. Install PyTorch

conda install pytorch torchvision torchaudio -c pytorch
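Before retrying the flash_attn install, it can help to confirm that torch is actually importable in the active environment: flash_attn's setup.py imports torch at build time, which is exactly what the traceback above failed on. A minimal sketch of that check (assuming only the standard library and, optionally, an installed torch):

```python
# Check whether torch is importable in the current environment,
# the same precondition flash_attn's setup.py relies on.
import importlib.util

spec = importlib.util.find_spec("torch")
if spec is None:
    print("torch is NOT installed -- flash_attn's build will fail")
else:
    import torch
    # torch.version.cuda is the CUDA version torch was built against
    print("torch", torch.__version__, "CUDA", torch.version.cuda)
```

Run this inside the same conda environment (here, `fenghuang`) that pip uses; a torch installed in a different environment will not be seen.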

2. Install flash_attn

pip install flash_attn==1.0.5

Note: CUDA 11.6 works without issues. With CUDA 12.0 the build may fail; in that case, retarget the CUDA symlink to 11.6 or 11.7.
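Retargeting the symlink usually means repointing /usr/local/cuda at the desired toolkit directory. A sketch of the idea, demonstrated in a temporary directory so it is safe to run (on a real machine the paths would be /usr/local/cuda-11.6 and so on, and changing them typically requires sudo):

```python
# Demonstrate switching a "cuda" symlink between toolkit versions,
# using a scratch directory instead of /usr/local.
import os
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "cuda-11.6"))
os.makedirs(os.path.join(root, "cuda-12.0"))
link = os.path.join(root, "cuda")

# Initially pointing at 12.0 (the problematic version)
os.symlink(os.path.join(root, "cuda-12.0"), link)
print(os.readlink(link))

# Remove the old link and retarget it at 11.6
os.remove(link)
os.symlink(os.path.join(root, "cuda-11.6"), link)
print(os.readlink(link))
```

After changing the real symlink, re-run `nvcc --version` to confirm the toolchain the build will pick up.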

pip install flash_attn==1.0.5
Collecting flash_attn==1.0.5
  Using cached flash_attn-1.0.5.tar.gz (2.0 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... done
  Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: einops in ./fenghuang/lib/python3.9/site-packages (from flash_attn==1.0.5) (0.6.1)
Requirement already satisfied: packaging in ./fenghuang/lib/python3.9/site-packages (from flash_attn==1.0.5) (23.1)
Requirement already satisfied: torch in ./fenghuang/lib/python3.9/site-packages (from flash_attn==1.0.5) (2.0.1)
Collecting ninja
  Using cached ninja-1.11.1-py2.py3-none-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (145 kB)
Requirement already satisfied: filelock in ./fenghuang/lib/python3.9/site-packages (from torch->flash_attn==1.0.5) (3.12.0)
Requirement already satisfied: typing-extensions in ./fenghuang/lib/python3.9/site-packages (from torch->flash_attn==1.0.5) (4.6.3)
Requirement already satisfied: sympy in ./fenghuang/lib/python3.9/site-packages (from torch->flash_attn==1.0.5) (1.12)
Requirement already satisfied: networkx in ./fenghuang/lib/python3.9/site-packages (from torch->flash_attn==1.0.5) (3.1)
Requirement already satisfied: jinja2 in ./fenghuang/lib/python3.9/site-packages (from torch->flash_attn==1.0.5) (3.1.2)
Requirement already satisfied: MarkupSafe>=2.0 in ./fenghuang/lib/python3.9/site-packages (from jinja2->torch->flash_attn==1.0.5) (2.1.3)
Requirement already satisfied: mpmath>=0.19 in ./fenghuang/lib/python3.9/site-packages (from sympy->torch->flash_attn==1.0.5) (1.3.0)
Building wheels for collected packages: flash_attn
  Building wheel for flash_attn (pyproject.toml) ... done
  Created wheel for flash_attn: filename=flash_attn-1.0.5-cp39-cp39-linux_x86_64.whl size=46072072 sha256=ac92e8b141cba225d099f693436af07965386fa307852a5b24ebf63110827bce
  Stored in directory: /home/aaa/.cache/pip/wheels/c3/43/d3/2243fbb153d026d8f9f498b13ce9dbbef18da5ef89fdedfed3
Successfully built flash_attn
Installing collected packages: ninja, flash_attn
Successfully installed flash_attn-1.0.5 ninja-1.11.1
 
