llama_index [Bug]: Cannot install llama-index-multi-modal-llms-ollama and llama-index-llms-ollama at the same time


Bug Description

llama-index-multi-modal-llms-ollama and llama-index-llms-ollama cannot be installed together because they depend on conflicting versions of the ollama client.

Version

0.11.1

Steps to Reproduce

Attempt to install llama-index-embeddings-ollama~=0.3.0, llama-index-multi-modal-llms-ollama~=0.2.0, and llama-index-llms-ollama 0.3.0 together, as in the sketch below.
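
For example, requesting all three pins in a single pip resolution (a minimal sketch of the failing install; the original report used a requirements.txt, whose exact layout is not shown here) fails:

```sh
# minimal reproduction: ask pip to resolve all three pins at once
pip install "llama-index-embeddings-ollama~=0.3.0" \
            "llama-index-multi-modal-llms-ollama~=0.2.0" \
            "llama-index-llms-ollama==0.3.0"
```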

Relevant Logs/Traceback

```
77.79 INFO: pip is looking at multiple versions of llama-index-multi-modal-llms-ollama to determine which version is compatible with other requirements. This could take a while.
77.86 ERROR: Cannot install -r requirements.txt (line 69), -r requirements.txt (line 78) and -r requirements.txt (line 87) because these package versions have conflicting dependencies.
77.86
77.86 The conflict is caused by:
77.86     llama-index-embeddings-ollama 0.3.0 depends on ollama<0.4.0 and >=0.3.1
77.86     llama-index-llms-ollama 0.3.0 depends on ollama>=0.3.0
77.86     llama-index-multi-modal-llms-ollama 0.2.0 depends on ollama<0.2.0 and >=0.1.6
77.86
77.86 To fix this you could try to:
77.86 1. loosen the range of package versions you've specified
77.86 2. remove package versions to allow pip to attempt to solve the dependency conflict
77.86
77.88 ERROR: ResolutionImpossible: for help visit https://pip.pypa.io/en/latest/topics/dependency-resolution/#dealing-with-dependency-conflicts
------
failed to solve: process "/bin/sh -c apt-get remove -y python3-yaml &&     python3 -m pip install -r requirements.txt" did not complete successfully: exit code: 1
```
mwg9r5ms 1#

I found a similar open issue: [Bug]: Conflicting versions. The suggested solution is to adjust the dependencies in your pyproject.toml file to ensure compatibility between the specified versions. Here is an example configuration:

```toml
[tool.poetry.dependencies]
python = ">=3.8.1,<4.0"
llama-index-core = "^0.10.1"
torch = ">=2.1.2,<2.3.1" # Adjusted to avoid conflict with torch 2.3.1
pillow = ">=10.2.0,<10.4.0" # Adjusted to avoid conflict with Pillow 10.4.0
torchvision = "^0.17.0"
ftfy = "^6.1.3"
llama-index-multi-modal-llms-ollama = "~0.1.3"
llama-index-llms-ollama = "~0.2.2"

[tool.poetry.group.dev.dependencies]
ipython = "8.10.0"
jupyter = "^1.0.0"
mypy = "0.991"
pre-commit = "3.2.0"
pylint = "2.15.10"
pytest = "7.2.1"
pytest-mock = "3.11.1"
ruff = "0.0.292"
tree-sitter-languages = "^1.8.0"
types-Deprecated = ">=0.1.0"
types-PyYAML = "^6.0.12.12"
types-protobuf = "^4.24.0.4"
types-redis = "4.5.5.0"
types-requests = "2.28.11.8"
types-setuptools = "67.1.0.0"

[tool.poetry.group.dev.dependencies.black]
extras = ["jupyter"]
version = "<=23.9.1,>=23.7.0"

[tool.poetry.group.dev.dependencies.codespell]
extras = ["toml"]
version = ">=v2.2.6"
```
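
After editing pyproject.toml this way, re-resolving with Poetry applies the new pins; a minimal sketch:

```sh
# regenerate the lock file against the adjusted pins, then install
poetry lock
poetry install
```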


This configuration ensures compatibility between the specified versions of `torch`, `Pillow`, `llama-index-multi-modal-llms-ollama`, and `llama-index-llms-ollama` [[1]](https://github.com/run-llama/llama_index/issues/15317).
Additionally, you may want to consider using `poetry` to manage your dependencies, since it can resolve conflicts more effectively. For more details, see the closed issue [[Bug]: LLAMA INDEX is becoming a Dependency Hell itself](https://github.com/run-llama/llama_index/issues/13441) [[2]](https://github.com/run-llama/llama_index/issues/13441).

*To continue talking to [Dosu](https://dosu.dev), mention @dosu.*
zf9nrax1 2#

The llama-index-multi-modal-llms-ollama package needs an internal update to support the ollama 0.3.x client: as the log above shows, version 0.2.0 still pins ollama<0.2.0, which cannot overlap with the ollama>=0.3.0 range required by llama-index-llms-ollama 0.3.0.
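
Until such an update is released, one possible workaround is to fall back to the older version pair suggested in the first answer (an assumption: whether these pins also satisfy your llama-index-embeddings-ollama requirement is not confirmed here):

```sh
# hypothetical workaround: older versions whose ollama client
# requirements are mutually compatible, per the first answer
pip install "llama-index-multi-modal-llms-ollama~=0.1.3" \
            "llama-index-llms-ollama~=0.2.2"
```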
