flash attention | ERROR: Failed to build installable wheels for some pyproject.toml based projects

Python version: 3.10

conda create -n fastervlm python=3.10 -y
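
Activate the new environment before installing anything into it:

conda activate fastervlm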

Installing flash_attn directly (pip install flash-attn) fails while building the wheel:

error: command '/usr/bin/g++' failed with exit code 1
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash-attn
Running setup.py clean for flash-attn
Failed to build flash-attn
ERROR: Failed to build installable wheels for some pyproject.toml based projects (flash-attn)
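
The underlying cause: the flash-attn package on PyPI is a source distribution, so when no matching prebuilt wheel can be fetched, pip compiles the CUDA extension locally, which requires a full CUDA toolkit. In the complete build log the failure typically shows up as a missing CUDA header, for example:

fatal error: cuda_bf16.h: No such file or directory

A quick check before reaching for a prebuilt wheel (a minimal sketch; run inside the environment created above):

# If nvcc is missing or very old, the from-source build has no CUDA toolkit to compile against.
nvcc --version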

Solution: install an official prebuilt wheel instead of compiling from source.

1. Open the official flash-attention releases page:

https://github.com/Dao-AILab/flash-attention/releases

2. Download the wheel that matches your environment. I used:
flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl

The filename encodes the requirements: cu12 = CUDA 12.x, torch2.2 = PyTorch 2.2, cxx11abiFALSE = built with the pre-C++11 ABI (matching the official PyTorch 2.2 pip wheels), cp310 = CPython 3.10, Linux x86_64.

wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
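
If you are not sure which tags apply to your machine, these one-liners print exactly the values encoded in the wheel filename (a sketch, assuming PyTorch is already installed in the environment):

python -c "import sys; print(sys.version_info[:2])"
# -> (3, 10) means cp310
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# -> e.g. 2.2.0+cu121 12.1 means torch2.2 and cu12
python -c "import torch; print(torch._C._GLIBCXX_USE_CXX11_ABI)"
# -> False means cxx11abiFALSE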

3. Install a matching PyTorch build first (the wheel above expects torch 2.2 with CUDA 12.x), then install the downloaded wheel:

pip install torch==2.2.0+cu121 torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu121
pip install flash_attn-2.7.4.post1+cu12torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl --no-build-isolation
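
Finally, verify the install. If the wheel's tags do not match the installed torch, this import is typically where it fails, with an undefined-symbol ImportError:

python -c "import flash_attn; print(flash_attn.__version__)"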
