### Priority
P2-High
### OS type
Ubuntu
### Hardware type
Xeon-GNR
### Installation method
- Pull docker images from hub.docker.com
- Build docker images from source
- Other
- N/A
### Deploy method
- Docker
- Docker Compose
- Kubernetes Helm Charts
- Other
- N/A
### Running nodes
Single Node
### What's the version?
main branch
v1.5rc branch
### Description
Additional dependency conflicts were encountered while attempting to fix the issue: requiring llama-index-core>=0.13.0 conflicts with llama-index-llms-text-generation-inference, since every available version (<=0.3.3) pins llama-index-core<0.13.0.
```
× No solution found when resolving dependencies for split (markers: python_full_version >= '3.13'):
╰─▶ Because only the following versions of llama-index-llms-text-generation-inference are available:
llama-index-llms-text-generation-inference==0.1.0
llama-index-llms-text-generation-inference==0.1.1
llama-index-llms-text-generation-inference==0.1.2
llama-index-llms-text-generation-inference==0.1.3
llama-index-llms-text-generation-inference==0.2.0
llama-index-llms-text-generation-inference==0.2.1
llama-index-llms-text-generation-inference==0.2.2
llama-index-llms-text-generation-inference==0.3.0
llama-index-llms-text-generation-inference==0.3.1
llama-index-llms-text-generation-inference==0.3.2
llama-index-llms-text-generation-inference==0.3.3
and llama-index-llms-text-generation-inference<=0.1.2 depends on llama-index-core>=0.10.41,<0.11.0,
we can conclude that llama-index-llms-text-generation-inference<0.1.3 depends on
llama-index-core>=0.10.41,<0.11.0.
And because llama-index-llms-text-generation-inference==0.1.3
depends on llama-index-core>=0.10.57,<0.11.0, we can conclude that
llama-index-llms-text-generation-inference<0.2.0 depends on llama-index-core>=0.10.41,<0.11.0.
And because llama-index-llms-text-generation-inference>=0.2.0,<=0.2.2 depends
on llama-index-core>=0.11.0,<0.12.0 and llama-index-core>=0.12.0,<0.13.0, we can
conclude that all versions of llama-index-llms-text-generation-inference depend on
llama-index-core>=0.10.41,<0.13.0.
And because you require llama-index-core>=0.13.0 and llama-index-llms-text-generation-inference,
we can conclude that your requirements are unsatisfiable.
hint: While the active Python version is 3.11, the resolution failed for other Python versions
supported by your project. Consider limiting your project's supported Python versions using
`requires-python`.
```
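
For context, the failure is a hard version conflict rather than a transient build problem: the project requires llama-index-core>=0.13.0, while every published version of llama-index-llms-text-generation-inference listed in the error (<=0.3.3) pins llama-index-core<0.13.0. The sketch below is a hypothetical, standalone reproduction of that conflict with uv; it is not the retriever's actual pyproject.toml, and the project name and `requires-python` range are made up for illustration.

```shell
# Hypothetical minimal reproduction of the resolver conflict (assumes uv is
# installed); names/versions mirror the error output above, not the real
# retriever project.
mkdir conflict-repro && cd conflict-repro
cat > pyproject.toml <<'EOF'
[project]
name = "conflict-repro"
version = "0.0.1"
# Per the resolver hint, requires-python controls which interpreter "splits"
# uv has to solve; the >=3.13 split is the one reported as unsatisfiable.
requires-python = ">=3.11"
dependencies = [
    "llama-index-core>=0.13.0",
    # Every published version (<=0.3.3) depends on llama-index-core<0.13.0,
    # which contradicts the constraint above.
    "llama-index-llms-text-generation-inference",
]
EOF
uv lock   # fails with a "No solution found" error like the one in the description
```

Narrowing `requires-python` as the hint suggests may change which splits uv reports, but based on the version list in the error output, the two constraints remain contradictory until a release of llama-index-llms-text-generation-inference that allows llama-index-core>=0.13.0 is available.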
### Reproduce steps
```shell
docker run --rm -v /var/run/docker.sock:/var/run/docker.sock aquasec/trivy:latest image 100.81.152.135:5000/opea/retriever:v1.5rc
```
### Raw log
```shell
```
### Attachments
No response