vllm installation: error when importing LLM

ttcibm8c · posted 4 months ago · in Other
Follow (0) | Answers (3) | Views (71)

### Your current environment

```
Traceback (most recent call last):
  File "inference.py", line 355, in <module>
    data_all_with_response = get_pred_func(data=data_all, task_prompt=task_prompt,\
  File "inference.py", line 24, in get_pred_vllm
    from vllm import LLM, SamplingParams
  File "/usr/local/lib/python3.8/dist-packages/vllm/__init__.py", line 3, in <module>
    from vllm.engine.arg_utils import AsyncEngineArgs, EngineArgs
  File "/usr/local/lib/python3.8/dist-packages/vllm/engine/arg_utils.py", line 6, in <module>
    from vllm.config import (CacheConfig, ModelConfig, ParallelConfig,
  File "/usr/local/lib/python3.8/dist-packages/vllm/config.py", line 9, in <module>
    from vllm.utils import get_cpu_memory, is_hip
  File "/usr/local/lib/python3.8/dist-packages/vllm/utils.py", line 8, in <module>
    from vllm._C import cuda_utils
ImportError: /usr/local/lib/python3.8/dist-packages/vllm/_C.cpython-38-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops15to_dtype_layout4callERKNS_6TensorEN3c108optionalINS5_10ScalarTypeEEENS6_INS5_6LayoutEEENS6_INS5_6DeviceEEENS6_IbEEbbNS6_INS5_12MemoryFormatEEE
```

### How you are installing vllm

```sh
export VLLM_VERSION=0.2.4
export PYTHON_VERSION=38
pip install https://github.com/vllm-project/vllm/releases/download/v${VLLM_VERSION}/vllm-${VLLM_VERSION}+cu118-cp${PYTHON_VERSION}-cp${PYTHON_VERSION}-manylinux1_x86_64.whl --extra-index-url https://download.pytorch.org/whl/cu118
```
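For what it's worth, an `undefined symbol: _ZN2at4_ops...` failure on `from vllm._C import ...` usually points to an ABI mismatch: the vLLM wheel was compiled against a different PyTorch build than the one actually installed, rather than the install itself being broken. A quick way to check, assuming a standard pip-installed torch:

```sh
# Inspect the torch build actually installed; the wheel above targets
# cu118, so torch should report a cu118 build of whatever torch version
# that particular vLLM release was compiled against (a torch 2.1.x for
# vllm 0.2.4, if memory serves -- treat the exact pin as an assumption).
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# See which vllm/torch versions pip resolved side by side.
pip list | grep -Ei "^(vllm|torch)"
```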

zpjtge22 #1

Have you tried `export VLLM_VERSION=0.4.0.post1`? 0.2.4 is quite old.
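Concretely, that would be the same wheel URL pattern with the newer version (assuming the 0.4.0.post1 release ships a cu118/cp38 wheel under the same naming scheme):

```sh
export VLLM_VERSION=0.4.0.post1
export PYTHON_VERSION=38
pip install https://github.com/vllm-project/vllm/releases/download/v${VLLM_VERSION}/vllm-${VLLM_VERSION}+cu118-cp${PYTHON_VERSION}-cp${PYTHON_VERSION}-manylinux1_x86_64.whl --extra-index-url https://download.pytorch.org/whl/cu118
```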


vsaztqbk #2

I used the configuration above and ran into another problem:

```sh
export VLLM_VERSION=0.4.0
export PYTHON_VERSION=38
pip install https://github.com/vllm-project/vllm/releases/download/v${VLLM_VERSION}/vllm-${VLLM_VERSION}+cu118-cp${PYTHON_VERSION}-cp${PYTHON_VERSION}-manylinux1_x86_64.whl --extra-index-url https://download.pytorch.org/whl/cu118
pip install flash_attn==2.2.0
```

At line 195 of "/usr/local/lib/python3.8/dist-packages/vllm/attention/backends/flash_attn.py" I get: `TypeError: flash_attn_varlen_func() got an unexpected keyword argument 'window_size'`.
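That `TypeError` is consistent with the pinned `flash_attn==2.2.0` simply predating the `window_size` argument, which flash-attn added in a later release for sliding-window attention. A sketch of the likely fix, assuming vLLM 0.4.x pairs with a 2.5.x flash-attn (check that release's requirements to be sure):

```sh
# flash_attn==2.2.0 predates flash_attn_varlen_func's window_size
# keyword; install a newer flash-attn so the call signature matches.
pip uninstall -y flash-attn
pip install "flash-attn>=2.5.0"
```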


svgewumm #3

Did you manage to solve this?
