conda env create -f environment.yaml fails: Consider using the `--user` option or check the permissions.

This post covers migrating a Conda environment: exporting it and importing it on the new machine. During the import I hit a "拒绝访问" (access denied) error; deleting and recreating the environment did not help, and the problem was finally solved by running 'conda env update -f environment.yaml'.


In short: after running create, follow up with update:

conda env update -f environment.yaml

 

Export the environment for migration:

conda env export > environment.yaml
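
A side note on the export itself: the default export pins exact build strings, which often fail to resolve on a different machine or OS. If the yaml is meant for migration, a more portable export may be worth trying first (both flags are standard conda options; this is a sketch, not a guaranteed fix):

conda env export --no-builds > environment.yaml
conda env export --from-history > environment.yaml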

Import it on the new machine:

conda env create -f environment.yaml
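
If the new environment should have a different name than the one recorded in the yaml, -n overrides it (a standard conda flag; the name here is just the one from my setup):

conda env create -f environment.yaml -n myenvs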

Mine kept failing with this error:

Could not install packages due to an EnvironmentError: [WinError 5] 拒绝访问。

: 'd:\\miniconda3\\envs\\myenvs\\scripts\\flask.exe'

Consider using the `--user` option or check the permissions.
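
For reference, WinError 5 on an .exe under Scripts usually means the file is locked: the environment is still activated somewhere, or a flask.exe process is still running. A rough checklist before reinstalling (plain Windows commands; the process name is taken from the error above), or simply reopen the terminal as Administrator and retry:

conda deactivate
tasklist | findstr flask
taskkill /F /IM flask.exe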

So I deleted the environment to reinstall it:

conda env remove -n sbd_attendance

Still the same error.

After lots of attempts, create kept erroring out, so I wondered whether update would work instead:

conda env update -f environment.yaml

And it worked. Go figure @_@ ...
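
For next time, the whole import can be collapsed into one line that falls back to update when create fails; --prune additionally removes packages no longer listed in the yaml (both are standard conda options; a sketch based on this experience, not a guaranteed fix):

conda env create -f environment.yaml || conda env update -f environment.yaml --prune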
