Installing flash_attn on Ubuntu

Things to keep in mind when installing flash_attn:
* flash_attn requires CUDA 11.6 or later. Check your CUDA version with `nvcc --version`; the toolkit itself is available from the CUDA download page.
* Check that your PyTorch build matches your CUDA version.
* Install ninja beforehand, otherwise compilation will take a very long time. Once ninja is installed, you can install flash_attn directly with `pip install flash-attn --no-build-isolation` (see the pre-flight check sketch after this list).
* Even with ninja installed in advance, a plain pip install can still compile extremely slowly; building from source is an alternative:

```bash
git clone https://github.com/Dao-AILab/flash-attention.git
cd flash-attention
python3 setup.py install
```

* Also, native flash attention currently only supports GPUs with Ampere, Hopper, and similar architectures, e.g. the A100 and H100; the V100 is a Volta-architecture GPU and is not supported.
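A minimal pre-flight check covering the points above might look like the sketch below (paths and the environment are assumptions; adjust for your own setup). `torch.cuda.get_device_capability()` reporting `(8, 0)` or higher indicates an Ampere-or-newer GPU:

```bash
# Pre-flight check before installing flash_attn (a sketch, not a fixed recipe)
nvcc --version                                                           # should report CUDA 11.6 or newer
python -c "import torch; print(torch.__version__, torch.version.cuda)"   # PyTorch's CUDA build should match the toolkit
python -c "import torch; print(torch.cuda.get_device_capability())"      # (8, 0) or higher = Ampere or newer
pip install ninja && ninja --version                                     # ninja speeds up the build considerably
pip install flash-attn --no-build-isolation
```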

A typical failure when nvcc is not available and CUDA_HOME is not set looks like this:

```
Collecting flash_attn
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/11/34/9bf60e736ed7bbe15055ac2dab48ec67d9dbd088d2b4ae318fd77190ab4e/flash_attn-2.7.4.post1.tar.gz (6.0 MB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error

  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [19 lines of output]
      /tmp/pip-install-d2werw7s/flash-attn_08e80afa06c24062902c2fd6836d8019/setup.py:106: UserWarning: flash_attn was requested, but nvcc was not found. Are you sure your environment has nvcc available? If you're installing within a container from https://hub.docker.com/r/pytorch/pytorch, only images whose names contain 'devel' will provide nvcc.
        warnings.warn(
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/tmp/pip-install-d2werw7s/flash-attn_08e80afa06c24062902c2fd6836d8019/setup.py", line 198, in <module>
          CUDAExtension(
        File "/root/myenv/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1130, in CUDAExtension
          library_dirs += library_paths(device_type="cuda")
        File "/root/myenv/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 1264, in library_paths
          if (not os.path.exists(_join_cuda_home(lib_dir)) and
        File "/root/myenv/lib/python3.10/site-packages/torch/utils/cpp_extension.py", line 2525, in _join_cuda_home
          raise OSError('CUDA_HOME environment variable is not set. '
      OSError: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.
      torch.__version__ = 2.6.0+cu124
      [end of output]

  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.
note: This is an issue with the packa
```
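The failure above means the build cannot find the CUDA toolkit because `CUDA_HOME` is unset and `nvcc` is not on the PATH. Assuming CUDA is installed under /usr/local/cuda (adjust to your actual install root), one way to fix it is:

```bash
# Point the build at the CUDA toolkit (assumes it lives under /usr/local/cuda)
export CUDA_HOME=/usr/local/cuda
export PATH=$CUDA_HOME/bin:$PATH
export LD_LIBRARY_PATH=$CUDA_HOME/lib64:$LD_LIBRARY_PATH

nvcc --version                                  # verify nvcc is now visible
pip install flash-attn --no-build-isolation     # retry the install
```

If nvcc is still missing inside a container, switch to a pytorch/pytorch image whose tag contains `devel`, as the warning in the log suggests.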