
Packages installing submodules cannot coexist #1793

Closed

ggould-tri opened this issue Mar 11, 2024 · 1 comment

ggould-tri commented Mar 11, 2024

Affected Rule

The issue is caused by the rule: pip_parse

Is this a regression?

No

Description

If two packages install submodules of the same parent module, and if the parent module contains even a vacuous __init__.py file, then a target cannot successfully depend on both packages.

For instance, consider the packages nvidia_cublas_cu11 and nvidia_cuda_cupti_cu11. Each defines a submodule of the common, otherwise empty parent module nvidia. Running pip install nvidia_cublas_cu11 nvidia_cuda_cupti_cu11 results in a python environment in which both modules can be imported. However, a bazel environment providing both externals cannot do so.
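
Roughly, each wheel contributes the shared parent package plus its own submodule, and pip merges the two into one tree; a minimal sketch of the expected behaviour (the exact wheel layouts may differ slightly):

    # nvidia_cublas_cu11      installs nvidia/__init__.py and nvidia/cublas/...
    # nvidia_cuda_cupti_cu11  installs nvidia/__init__.py and nvidia/cuda_cupti/...
    # pip merges both into a single site-packages/nvidia/, so in the venv:
    import nvidia.cublas
    import nvidia.cuda_cupti  # both imports succeed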

🔬 Minimal Reproduction

Here is an example github repository: https://github.com/ggould-tri/rules_python_dup_test

git clone https://github.com/ggould-tri/rules_python_dup_test
cd rules_python_dup_test

If you run the following, the test runs without bazel and succeeds:

python3 -m venv my_venv
my_venv/bin/pip3 install nvidia_cublas_cu11 nvidia_cuda_cupti_cu11
my_venv/bin/python3 nvidia_submodules_test.py

If you run the following, the test runs with bazel and fails:

bazel test --test_output=streamed //:nvidia_submodules_test

🔥 Exception or Error

There is no bazel error. The python error is

    import nvidia.cuda_cupti
ModuleNotFoundError: No module named 'nvidia.cuda_cupti'

🌍 Your Environment

Operating System:

Ubuntu 20.04

Output of bazel version:

Build label: 6.4.0
Build target: bazel-out/k8-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Thu Oct 19 17:07:43 2023 (1697735263)
Build timestamp: 1697735263
Build timestamp as int: 1697735263

Rules_python version:

0.31.0

Anything else relevant?

The underlying issue here is that while pip installs every package into a common site-packages directory, rules_python installs each package into its own directory; as a result, the sys.path lookup of the parent module resolves to just one of the packages, and whichever one it finds does not contain the other package's submodule, so a target can never import both.
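
The mechanism can be reproduced without Bazel at all. The following is a minimal sketch (all directory and file names are made up for illustration) that mimics two separate per-package install trees on sys.path:

    # Two fake "installs", each containing nvidia/__init__.py plus one submodule,
    # are appended to sys.path. Because a parent package with an __init__.py is
    # bound to a single directory, only the first tree's submodule is importable.
    import os, sys, tempfile

    root = tempfile.mkdtemp()
    for install, submodule in [("install_a", "cublas"), ("install_b", "cuda_cupti")]:
        pkg_dir = os.path.join(root, install, "nvidia")
        os.makedirs(pkg_dir)
        open(os.path.join(pkg_dir, "__init__.py"), "w").close()
        open(os.path.join(pkg_dir, submodule + ".py"), "w").close()
        sys.path.append(os.path.join(root, install))

    import nvidia.cublas      # resolves in install_a
    import nvidia.cuda_cupti  # ModuleNotFoundError: install_b/nvidia is never searched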

As commented in the linked repo, experimental_requirement_cycles doesn't help with this. However, the solution might require a new feature similar to experimental_requirement_cycles, in that it would require declaring groups of packages; each group would have to share a single site-packages directory into which every member installs.
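
Purely to illustrate the shape such a declaration might take (this is hypothetical syntax, not an existing pip_parse attribute), the groups could be expressed much like experimental_requirement_cycles:

    # Hypothetical illustration only -- not an existing rules_python feature.
    # Groups of requirements whose members would share one site-packages tree:
    shared_install_groups = {
        "nvidia": [
            "nvidia_cublas_cu11",
            "nvidia_cuda_cupti_cu11",
        ],
    }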

It's worth noting that this isn't just an NVIDIA thing: Google places the contents of both the google_auth and protobuf packages under a google supermodule. The same is likely true of other commercial vendors.


groodt commented Aug 24, 2024

Closing as duplicate of tracking issue #2156

groodt closed this as completed Aug 24, 2024