Changing the yum source mirror on CentOS 7 to install the ninja-build package



Preface

The environment setups for existing object-tracking algorithms are mostly documented for Ubuntu 18.04, but the server I am using runs CentOS 7. CentOS 7 installs packages with yum, while Ubuntu 18.04 uses apt-get. In most cases, simply replacing apt-get with yum in the instructions works, but a few packages (ninja-build, libturbojpeg) could not be installed because of repository (software source) problems, and I took quite a few detours along the way.


1. Installing the apt tool on CentOS (×)

Searching for "centos apt" turns up plenty of tutorials for installing apt on CentOS. I spent a lot of effort trying them without getting anywhere: the failure to fetch the package is a repository (source) problem and has little to do with whether you use apt or yum.

2. Manually installing the ninja-build package (√)

Error: the GPG key is already installed, but it is not valid for this package


Solution: first download the rpm package to a chosen directory, then install the rpm package manually.

Note: I later managed to switch the repository (see Section 5), so this method is for reference only!
Download the rpm package to a chosen directory with yum:

yum install --downloadonly --downloaddir=<download-dir> <full-package-name>
https://www.cnblogs.com/yanjieli/archive/2019/04/17/10725360.html
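
As a concrete illustration (the /tmp/rpms directory here is just a placeholder I chose), downloading ninja-build and its dependencies without installing them would look like:

yum install --downloadonly --downloaddir=/tmp/rpms ninja-build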

Install the rpm package manually:

rpm -ivh <full-package-name>
http://c.biancheng.net/view/2872.html
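
Continuing the illustration above, installing everything that was downloaded into /tmp/rpms:

rpm -ivh /tmp/rpms/*.rpm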

Check where yum installed a package:

rpm -qa | grep <package-name>    # find the full package name
rpm -ql <full-package-name>      # show where the package's files were installed
https://blog.youkuaiyun.com/wd2014610/article/details/79659073
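
For example, checking the ninja-build package installed above:

rpm -qa | grep ninja    # prints the full package name, e.g. ninja-build-<version>.el7.x86_64
rpm -ql ninja-build     # lists the files the package installed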

Uninstall an rpm package:

rpm -e <full-package-name>
https://blog.youkuaiyun.com/weixin_44317658/article/details/112288407
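
And removing the manually installed package again if it is no longer needed:

rpm -e ninja-build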

3. Installing the libturbojpeg package

Error: no package available

Solution:

The package cannot be found because upstream (https://www.libjpeg-turbo.org/) renamed it. (That may not be the real reason, though, since apt-get on Ubuntu can still install a package called libturbojpeg.)

sudo yum install libjpeg-turbo
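
To check afterwards whether the TurboJPEG shared library is actually present (a quick sanity check I am adding here, not part of the original troubleshooting):

ldconfig -p | grep -i turbojpeg    # lists libturbojpeg.so.* if the dynamic linker can see it
yum search turbojpeg               # check whether the TurboJPEG API is split into a separate package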

4. prroi_pool-related issues

Error 1: no module named prroi_pool


Solution:

Reconfigure the Precise ROI Pooling library following this blog post (a rough sketch follows the link):

https://blog.youkuaiyun.com/qq_17783559/article/details/117933369?spm=1001.2014.3001.5506
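
As a sketch of what that reconfiguration usually involves (this assumes a pytracking-style repository that ships Precise ROI Pooling as a git submodule and JIT-compiles the CUDA extension, which is exactly why ninja matters here):

git submodule update --init --recursive   # fetch the PreciseRoIPooling sources if the submodule directory is empty
pip install ninja                         # the extension is compiled at import time and the build expects ninja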

Error 2: invalid syntax


Solution:

If you are already on an administrator account, there is no need to set the environment variable; setting it is precisely what triggers this error.

Error 3: unable to build the extension


Solution:

On Ubuntu I ran into the same nvcc-not-found problem (/bin/sh: 1: nvcc: not found). Configuring the system as described in the blog post below solved it.

https://blog.youkuaiyun.com/weixin_43046653/article/details/100019901
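
That method boils down to putting the CUDA toolkit on the PATH. A minimal sketch, assuming CUDA is installed under /usr/local/cuda (adjust to your actual install path), appended to ~/.bashrc:

export PATH=/usr/local/cuda/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH

After source ~/.bashrc, nvcc --version should print the compiler version and the extension build can find it.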

5. Installing the ninja-build package after switching to the EPEL source

  • Switch the software source to EPEL (the [epel] section of the repo file):
[epel]
name=EPEL for redhat/centos $releasever - $basearch
failovermethod=priority
gpgcheck=1
gpgkey=http://mirrors.tencentyun.com/epel/RPM-GPG-KEY-EPEL-7
enabled=1
baseurl=http://mirrors.tencentyun.com/epel/$releasever/$basearch/

The key point is to change gpgkey (where the signing key is stored) to match the new mirror. For baseurl, only replace the host with the mirror you want; leave the epel/... part of the path unchanged.
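
Applying the change and retrying the install roughly looks like this (the repo file path is the standard EPEL location; back it up before editing):

sudo cp /etc/yum.repos.d/epel.repo /etc/yum.repos.d/epel.repo.bak   # keep a copy of the original
sudo vi /etc/yum.repos.d/epel.repo                                  # replace the [epel] section with the one above
sudo yum clean all && sudo yum makecache                            # rebuild the metadata cache against the new mirror
sudo yum install ninja-build                                        # should now install and verify correctly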



Summary

The above mainly addresses the repository (source) problems of installing software with yum on CentOS. Even after solving them, I still could not finish configuring the pytracking environment, so in the end I reinstalled the server with Ubuntu. A full walkthrough of setting up Ubuntu Server will follow in a separate post.
