vLLM: "Failed to infer device type" - a simple solution

The error in the title is raised by vLLM's device configuration in vllm/config.py. The DeviceConfig logic that performs automatic device detection (the exact method and line number vary between versions) looks like this:
import torch


class DeviceConfig:

    def __init__(self, device: str = "auto") -> None:
        if device == "auto":
            # Automated device type detection
            from vllm.platforms import current_platform
            self.device_type = current_platform.device_type
            if not self.device_type:
                raise RuntimeError(
                    "Failed to infer device type, please set "
                    "the environment variable `VLLM_LOGGING_LEVEL=DEBUG` "
                    "to turn on verbose logging to help debug the issue.")
        else:
            # Device type is assigned explicitly
            self.device_type = device

        # Some device types require processing inputs on CPU
        if self.device_type in ["neuron"]:
            self.device = torch.device("cpu")
        elif self.device_type in ["tpu"]:
            self.device = None
        else:
            # Set device with device type
            self.device = torch.device(self.device_type)
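Since the RuntimeError fires exactly when `current_platform.device_type` comes back empty, a quick way to see what your environment actually reports is to query the same objects directly. This is a minimal diagnostic sketch, not part of vLLM itself; the `torch.cuda.is_available()` check is only a sanity hint and assumes you expect a CUDA GPU:

    # Minimal diagnostic: query the same objects DeviceConfig uses.
    # Assumption: you expect a CUDA platform; the torch check is only a hint.
    import torch
    from vllm.platforms import current_platform

    print("torch.cuda.is_available():", torch.cuda.is_available())
    print("current_platform:", current_platform)
    print("device_type:", repr(current_platform.device_type))

    # An empty device_type here is exactly what triggers
    # "Failed to infer device type" in DeviceConfig.

If `device_type` prints as empty, vLLM could not map the environment to any supported platform, which matches the "No platform detected" lines in the log below.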
When no platform can be detected, starting the OpenAI-compatible API server fails with a log like the following:
RuntimeError: Failed to infer device type, please set the environment variable `VLLM_LOGGING_LEVEL=DEBUG` to turn on verbose logging to help debug the issue.

2025-07-25 13:35:45 INFO 07-24 22:35:45 [__init__.py:248] No platform detected, vLLM is running on UnspecifiedPlatform
2025-07-25 13:35:45 WARNING 07-24 22:35:45 [_custom_ops.py:20] Failed to import from vllm._C with ImportError('\x01: cannot open shared object file: No such file or directory')
2025-07-25 13:37:16 INFO 07-24 22:37:16 [__init__.py:248] No platform detected, vLLM is running on UnspecifiedPlatform
2025-07-25 13:37:16 WARNING 07-24 22:37:16 [_custom_ops.py:20] Failed to import from vllm._C with ImportError('\x01: cannot open shared object file: No such file or directory')
2025-07-25 13:37:19 Traceback (most recent call last):
2025-07-25 13:37:19   File "<frozen runpy>", line 198, in _run_module_as_main
2025-07-25 13:37:19   File "<frozen runpy>", line 88, in _run_code
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/vllm/entrypoints/openai/api_server.py", line 1491, in <module>
2025-07-25 13:37:19     parser = make_arg_parser(parser)
2025-07-25 13:37:19              ^^^^^^^^^^^^^^^^^^^^^^^
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/vllm/entrypoints/openai/cli_args.py", line 266, in make_arg_parser
2025-07-25 13:37:19     parser = AsyncEngineArgs.add_cli_args(parser)
2025-07-25 13:37:19              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/vllm/engine/arg_utils.py", line 1717, in add_cli_args
2025-07-25 13:37:19     parser = EngineArgs.add_cli_args(parser)
2025-07-25 13:37:19              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/vllm/engine/arg_utils.py", line 906, in add_cli_args
2025-07-25 13:37:19     vllm_kwargs = get_kwargs(VllmConfig)
2025-07-25 13:37:19                   ^^^^^^^^^^^^^^^^^^^^^^
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/vllm/engine/arg_utils.py", line 285, in get_kwargs
2025-07-25 13:37:19     return copy.deepcopy(_compute_kwargs(cls))
2025-07-25 13:37:19            ^^^^^^^^^^^^^^^^^^^^
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/vllm/engine/arg_utils.py", line 189, in _compute_kwargs
2025-07-25 13:37:19     default = field.default_factory()
2025-07-25 13:37:19               ^^^^^^^^^^^^^^^^^^^^^^^
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/pydantic/_internal/_dataclasses.py", line 123, in __init__
2025-07-25 13:37:19     s.__pydantic_validator__.validate_python(ArgsKwargs(args, kwargs), self_instance=s)
2025-07-25 13:37:19   File "/usr/local/lib/python3.12/dist-packages/vllm/config.py", line 2413, in __post_init__
2025-07-25 13:37:19     raise RuntimeError(
2025-07-25 13:37:19 RuntimeError: Failed to infer device type, please set the environment variable `VLLM_LOGGING_LEVEL=DEBUG` to turn on verbose logging to help debug the issue.
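The traceback also tells you the first thing to try: set `VLLM_LOGGING_LEVEL=DEBUG` and rerun, so the platform-detection code logs why each backend was rejected. The snippet below is a sketch of reproducing the same check in isolation; it assumes `DeviceConfig` is importable from `vllm.config`, as the traceback above indicates:

    # Reproduce the failing device-type check with verbose logging enabled.
    # VLLM_LOGGING_LEVEL must be set before vllm is imported.
    import os
    os.environ["VLLM_LOGGING_LEVEL"] = "DEBUG"

    from vllm.config import DeviceConfig  # location per the traceback above

    try:
        cfg = DeviceConfig(device="auto")
        print("Detected device type:", cfg.device_type)
    except RuntimeError as e:
        # Same error as in the server log; the DEBUG output printed above it
        # shows which platforms were probed and why each one was skipped.
        print("Device detection failed:", e)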