flash attention | ERROR: Failed to build installable wheels for some pyproject.toml based projects

Python version: 3.10

conda create -n fastervlm python=3.10 -y

Installing flash_attn directly with pip fails during the source build:

      error: command '/usr/bin/g++' failed with exit code 1
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash-attn)
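
For reference, the command that typically hits this is the plain source install (the exact invocation here is an assumption; the same failure also shows up when flash-attn is pulled in through a project's requirements):

pip install flash-attn --no-build-isolation

A source build compiles the CUDA kernels locally, which needs a CUDA toolkit (nvcc) compatible with the installed PyTorch plus a lot of RAM and time; installing a prebuilt wheel, as described below, skips the g++/nvcc step entirely.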

Solution:

1. Go to the official flash-attention releases page:

https://github.com/Dao-AILab/flash-attention/releases

2. Download the wheel that matches your environment. I used:
flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
(the tags decode as: cu12 = CUDA 12, torch2.2 = PyTorch 2.2, cxx11abiFALSE = the C++11 ABI setting of the torch build, cp310 = Python 3.10)

wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
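
Each of those tags has to match your environment. With torch installed (step 3 below installs it if it is not already present), a quick check looks like this (a sketch; torch.compiled_with_cxx11_abi() reports the ABI flag corresponding to cxx11abiTRUE/FALSE in the file name):

python -V
python -c "import torch; print(torch.__version__, torch.version.cuda)"
python -c "import torch; print(torch.compiled_with_cxx11_abi())"

The interpreter version must match cp310, torch.__version__ and torch.version.cuda must match torch2.2/cu12, and the ABI flag must match cxx11abiFALSE.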

3. Install PyTorch (matching the wheel's torch2.2/cu12 tags) and then the downloaded wheel:

pip install torch==2.2.0+cu121 torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu121
pip install flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl --no-build-isolation
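
To confirm the install actually works, a minimal smoke test (a sketch, assuming a CUDA GPU is available; shapes follow flash_attn_func's (batch, seqlen, nheads, headdim) convention with fp16 inputs):

python - <<'EOF'
import torch
import flash_attn
from flash_attn import flash_attn_func

print("flash_attn version:", flash_attn.__version__)

# Tiny random Q/K/V in half precision on the GPU.
q = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(1, 128, 8, 64, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v, causal=False)
print("output shape:", out.shape)  # expected: (1, 128, 8, 64)
EOF

If the import fails with an undefined-symbol error, the wheel's torch/ABI tags do not match the installed PyTorch build.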

A reader question from the comments: how do I install FlashAttention-2? The relevant part of the pip log:

(evo1) zsy@amax-Rack-Server:/mnt/disk/zsy/projects/Evo-1/Evo_1$ MAX_JOBS=64 pip install -v flash-attn --no-build-isolation
Using pip 25.3 from /mnt/disk/zsy/miniconda3/envs/evo1/lib/python3.10/site-packages/pip (python 3.10)
Collecting flash-attn
  Downloading flash_attn-2.8.3.tar.gz (8.4 MB)
torch.__version__ = 2.5.1+cu124
running bdist_wheel
Guessing wheel URL: https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.3/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
error: Remote end closed connection without response
error: subprocess-exited-with-error

× Building wheel for flash-attn (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> No available output.

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash-attn
Failed to build flash-attn
error: failed-wheel-build-for-install

× Failed to build installable wheels for some pyproject.toml based projects
╰─> flash-attn
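
The same manual-wheel approach from above applies here: the build backend tried to download the prebuilt wheel itself (the "Guessing wheel URL" line) and the connection dropped. Fetching that exact file manually and installing it locally avoids the in-build download (the file name is copied from the log; double-check that cu12/torch2.5/cxx11abiFALSE/cp310 match the torch 2.5.1+cu124 and Python 3.10 environment shown in the log):

wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.8.3/flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
pip install flash_attn-2.8.3+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl --no-build-isolation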