fix: Add libs/python3.lib to libpython target for SABI builds on Windows #1820
Conversation
I just tried out the method mentioned in your fix: I made a small cc_binary and looked at the resulting linker commands. The compiler is now trying to link both python311.lib and python3.lib. I'm not sure what the implications are here, and I'm surprised there are no duplicate definition errors produced. Do you know if this is a problem? |
Yes, that is potentially a problem - when building against the stable ABI, you do not want to link against the versioned python3XY.lib. This would probably be fine if only one of the two ever made it onto the link line. But I'm more curious about the defines here: I'm mentioning this because targeting the stable ABI only takes effect if the Py_LIMITED_API macro is actually defined. Can you share the linker flags when building with Py_LIMITED_API set in the defines of your target? |
Sorry, my mistake. I have been misreading "LIMITED_API" as "LIMITED_ABI" the whole time. Let me try once more with the correct spelling. Update: same issue with the fixed spelling, now set inside the defines=[...] attribute as requested. I verified that the object file was compiled with Py_LIMITED_API by looking at the compilation flags.
I guess whether you define Py_LIMITED_API or not, the libraries get linked anyway due to Bazel's rules. It makes sense to me that Bazel would instruct the compiler to link all libs mentioned in the srcs of a cc_library dependency. |
I mean, that much is expected - the headers appearing in your compilation command there are always versioned to the Python distribution they are from. Also, from the pyconfig.h snippet, the conditional declaration of either python3.lib or python3X.lib means that only one of the two should ever be requested by the linker. You can see that this is true e.g. by checking the CI statuses of nicholasjng/nanobind-bazel#16, where I completely removed the current_py_cc_libs dependency. TL;DR: Explicitly specifying the current libs on Windows is normally not necessary because of the selection mechanism in pyconfig.h, and perhaps counterproductive for the reasons you mentioned (Bazel links all libs appearing as deps in a command). |
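For context, the selection logic in CPython's PC/pyconfig.h looks roughly like the sketch below. This is a paraphrase, not the verbatim header, and the 3.11 version numbers are purely illustrative.

```c
/* Rough sketch of the library auto-selection CPython's PC/pyconfig.h
 * performs for MSVC builds that are not building CPython itself.
 * Version numbers are illustrative (here: 3.11). */
#if defined(_DEBUG)
#  pragma comment(lib, "python311_d.lib")  /* debug builds use the _d lib */
#elif defined(Py_LIMITED_API)
#  pragma comment(lib, "python3.lib")      /* stable ABI: unversioned lib */
#else
#  pragma comment(lib, "python311.lib")    /* default: versioned lib */
#endif
```

Because the header only ever requests one of these libraries, an extension compiled with Py_LIMITED_API defined should make MSVC search for python3.lib and nothing else, which is exactly the file this PR adds to the libpython target.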
I'm thinking about what you said here.
In my project, I have a python_repository provided by rules_python for Python version 3.11, and this version is registered as the default Python toolchain. Outside of the project folder, my system has no globally installed Python 3.11. Within my project, I want to embed a Python interpreter / write Python extensions for Python 3.11. I am hitting the hermetic Python use case that you are mentioning, right? If I don't include these dependencies, the #pragma will still try to link python311.lib and I will get a linker error (if I still include the current_py_cc_headers). On the other hand, if I installed Python 3.10 at the system level (and don't include current_py_cc_headers or current_py_cc_libs), MSVC would find the system python310.lib/python310.dll and link my project's binaries against those on account of the #pragma directives. But if I write an extension this way and use it as a dependency for a py_binary, is it going to cause issues? Both python310.dll and python311.dll will be loaded at runtime, defining many of the same symbols. So if I take that route, I must be sure that the system Python matches the hermetic Python. Does that match your understanding? |
I think we mean different things: I am referring to "hermetic Python" as in "a configured rules_python toolchain", not as an embedded interpreter. MSVC invocations will not link against anything from the system, only against the configured rules_python toolchain. (To verify, see the 3.12 build failure in the PR that I linked - I install Python 3.12 on the runner early on, but the build still fails due to a linker error, even though the system Python 3.12 has all of the required libs.) The only way to screw up hermeticity is to link libs from the system Python explicitly in your package setup code, i.e. by supplying the system interpreter's libdir as a linkopt. EDIT: Ah, now I get it. Yes, the current libs will be taken from the rules_python toolchain as well - if you want to redistribute SABI extensions, the libs you build and package against come from that toolchain, too. |
Okay, I still can't get my head around how you can remove the current_py_cc_libs dependency and still have your cc_library compile. On my computer I get a linker error, which from my point of view is completely expected: without the dependency, MSVC is never told where the lib file is.
But I think I'm a little out of my depth here, so I'll need to go off on my own and read more about Windows/Bazel/Python. Unrelatedly, I did uncover something interesting. Using Process Explorer and launching the Python interpreter, I found it's completely normal for the Python process to load both python3.dll and pythonXY.dll. But this may just be because there are transitive dependencies being imported at startup, where every single .pyd only depends on either python3.dll or pythonXY.dll. In any case, I imagine it is safe to add the python*.dll files into the sources in your PR, which would fix #1823. Thanks for taking up this issue 👍 |
I don't have a Windows machine around, so I am happy that you are investigating this and attempting to make building on Windows better. Let me know when you would like a review from the maintainer point of view. The checklist would be roughly: a changelog entry, the backstory captured somewhere (an issue or a comment), and a test covering the new behavior.
Let me know if you would like any ideas/suggestions from me or other maintainers. |
@aignas Yes please, this is ready for review. I can copy the backstory over into a comment here in the thread; the issue you mentioned is technically about a slightly different use case. Is the PR title (or a slightly more elaborate description thereof) sufficient as a changelog entry? PS: If you're fine with it, I would add the other versioned DLL mentioned in #1823 to the list, and then it should actually also fix that issue. |
The PR description becomes the commit message once the PR is merged, and the CHANGELOG.md file still needs to be modified manually. As for the backstory, having it as a comment in this PR may also be sufficient; no need to create an issue just for that. I am curious if it would be possible to add a test target (maybe under tests/toolchains?) where we could ensure that the new code works? |
Force-pushed from 7c2f168 to 7da5951
The following is the backstory of this PR, previously found in the description.

Backstory: I am currently implementing Bazel support for the nanobind project, which makes creating C++ Python bindings very easy; it is the successor to pybind11 in that regard. Development efforts on this happen in the nanobind-bazel repository.

One particular feature in nanobind is targeting the Python stable ABI, which can be used to shrink the build matrix for Python wheels by promising compatibility across minor versions of Python even with C++ extensions. Due to later additions of nanobind prerequisites to the Python limited API, targeting the stable ABI using nanobind only became possible starting with Python 3.12.

I added a Bazel config to the easiest, "hello world"-ish nanobind example project here, which used to build just fine on all platforms (Win/MacOS/Linux) and versions (Python 3.8-3.12), but only because I was breaking hermeticity in the Windows case, passing the libdir of the system interpreter to the build as a linkopt (in the setup.py, to be specific). (For an example of where this is still done, see https://github.com/google/benchmark/blob/d5c55e8c42a8782cb24f6011d0e88449237ab842/setup.py#L71-L74.)

Once I stopped doing that, I started getting build errors for Win+Python 3.12, as for example in this PR: nicholasjng/nanobind-bazel#16. Note that when building for Python 3.12, I target the stable ABI by setting the Py_LIMITED_API macro to 3.12. As would be expected from the pyconfig.h selection mechanism, the linker then goes looking for the unversioned python3.lib, which the hermetic toolchain currently does not ship.

As a final point, none of this is an issue on Unix platforms, since those get their Python symbols dynamically at runtime without needing any linkage. Please let me know your thoughts. |
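As a concrete illustration of the macro mentioned in the backstory (a minimal sketch, not taken from nanobind; the module name is made up): an extension opts into the stable ABI by defining Py_LIMITED_API to the hex-encoded minimum CPython version before including Python.h.

```c
/* Minimal sketch of a stable-ABI extension module. 0x030C0000 is the hex
 * encoding of the minimum supported version, here 3.12.0. In a Bazel build
 * the macro would typically come from a defines/copts attribute rather
 * than being hardcoded in the source. */
#define Py_LIMITED_API 0x030C0000
#include <Python.h>

static struct PyModuleDef example_module = {
    PyModuleDef_HEAD_INIT,
    "example",  /* module name, purely illustrative */
    NULL,       /* no docstring */
    -1,         /* no per-module state */
    NULL,       /* no methods */
};

PyMODINIT_FUNC PyInit_example(void) {
    return PyModule_Create(&example_module);
}
```

With the macro defined, pyconfig.h takes the Py_LIMITED_API branch of its selection logic and MSVC goes looking for the unversioned python3.lib at link time.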
I added the changelog entry and split off the backstory into a comment, removing it from the PR description. I also took the liberty to add the other versioned DLL mentioned in #1823 to the list. As for tests - I believe that since MSVC decides the library to link in the pyconfig.h header, we can add a regression test for this issue by compiling a simple cc_test with Py_LIMITED_API defined, which forces the linker to go looking for python3.lib. |
I think //tests/cc/current_py_cc_libs:python_libs_linking_test is what you want? I created a simple cc_test to verify some linking behavior, but it didn't pass on Windows and I couldn't figure out why. This thread sounds like the reason?
You're probably looking at e.g. tests/cc/current_py_cc_headers/current_py_cc_headers_tests.bzl? Those are called "analysis tests" and are for verifying that the bzl logic in the rule code is working. I wouldn't recommend trying to use those for this case -- what you would have to do is look into the linker args and try to figure out whether they look right, which is fairly brittle and painful. I would just create cc_test targets that are built/run instead. |
Yes, that should be it. What do I need to expect in rules_python/tests/cc/current_py_cc_libs/current_py_cc_libs_tests.bzl (lines 52 to 57 at c5c03b2)?
|
Yeah. Just use |
The local Mac build is still broken. Looks like the library search path is messed up? |
I think the current libs test is fundamentally broken on MacOS (and potentially Windows too). The linkage of the Python shared library doesn't look right: the first path the loader tries is the baked-in dylib path of the hermetic Python's libpython, i.e. an absolute install name that doesn't exist on the machine running the test.
Not sure what to do here. I think the best option is to create a new issue for the MacOS failure and keep this PR focused on the Windows fix.
This patch does not fix the SABI build failure in my project, see nicholasjng/nanobind-bazel#18 and the CI run therein. Possibly related thread (for MacOS): pyinstaller/pyinstaller#7582 |
I'm not too surprised by that :). There are lots of platform-specific idiosyncrasies and I'm by no means a linker expert, even less so with Windows and Mac. I don't really have a Mac or Windows machine to experiment with, either. I think the reality is someone with more knowledge/experience/motivation for those platforms will need to step up. The code powering the current libs targets is also not something I'm deeply familiar with.
Yes, that's fine. Incremental progress is good.
From what I understand of Bazel's cc_test [1], the way it should be working is: the test binary is linked against the toolchain's shared libpython at build time, and at run time the dynamic loader is supposed to find that library in the runfiles via the rpath entries Bazel adds.
There might be some filename mangling and SONAME trickery, but I can't recall.
[1] cc_binary might behave differently, btw. I recall seeing some code paths in the cc rule implementation that special-case cc_test vs cc_binary, but what activated them was convoluted at times, so YMMV. |
Not to worry, I'm here to (hopefully) see this through. I'm also not an expert, and every new linker topic I see prompts an extensive Google search, but I'll get there :) It seems that on macOS, a binary called otool can be used to inspect an executable's load commands, i.e. which dylibs it references and which rpaths it carries. As a note, here is a blog post detailing the practice when bundling dylibs on macOS. Judging from the full otool output of the test binary, it looks like the first LC_LOAD_DYLIB entry (for libpython3.11) is the culprit, but the rpath that Bazel set with the linker is correct. I'll try to add the test shortly, and would appreciate feedback as your time permits. |
Yes, I agree. My guess is something is missing from the linker command to generate an LC_LOAD_DYLIB entry like @rpath/libpython3.11.dylib instead of the absolute install name. |
I added a Windows test with Py_LIMITED_API defined.
When targeting the Python Stable ABI on Windows (by setting the Py_LIMITED_API macro to a Python minimum version hex), the unversioned python3.lib needs to be linked instead of the versioned one (e.g. python38.lib for Python 3.8). Python's own config sets the library to link by default in a header called pyconfig.h (https://github.com/python/cpython/blob/9cc9e277254023c0ca08e1a9e379fd89475ca9c2/PC/pyconfig.h#L270), which prompts the linker to search for python3.lib if a stable ABI extension is built using `@rules_python` toolchains. Since this library is not exported on Windows in the `python_repository()` rule, it's added now to allow Python SABI extensions to be built (and linked) on Windows with `@rules_python`. Since Python takes responsibility for linking the correct lib on Windows, and never both at the same time, no other changes are made.
Reuses the existing cc_test targeting the stable ABI on Windows. Since this prompts MSVC to search for libs/python3.lib, it serves as a check that the library search path is intact on Windows.
Force-pushed from 52f7b8d to 26af5b5
This could potentially fix the header test.
@rickeylev I think the new test is failing on Windows, and it is not producing any output at all. |
Wow, empty output? That's sort of impressive. I'd have expected a message about segfaulting or something. In any case, perhaps line 5 in python_libs_linking_test isn't actually triggering? That C++ code is buildable, but not runnable. That it's getting past the build step means the (build-time) linking is happy enough. That it's failing at runtime is expected. |
Ah, nice! Do you have a preference on how to proceed? I'm guessing it's not as easy as adding an early return, since the compiler could then optimize the Python calls away. |
Yeah, that's why the condition it checks is argc -- that's a runtime condition, so the compiler can't optimize the calls away. Maybe print out argv or env vars and see if there's something to trigger in there? If not, adding something to args/env of the cc_test target would let us set a signal to check.
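To make the trick concrete, here is a hypothetical sketch of such a linking smoke test (not the actual rules_python test source; the environment variable name is invented for illustration). The Python C API calls sit behind conditions that are only known at run time, so the linker has to resolve the symbols, while a normal test invocation never executes them.

```c
/* Hypothetical linking smoke test: because argc and the environment are
 * unknown at compile time, the compiler cannot prove the branches dead and
 * must keep the references to Py_Initialize/Py_Finalize, forcing the
 * linker to resolve them (on Windows, via python3.lib for SABI builds). */
#include <stdlib.h>
#include <Python.h>

int main(int argc, char **argv) {
  (void)argv;
  /* Never true for a plain `bazel test` invocation, but a runtime fact. */
  if (argc > 100) {
    Py_Initialize();
    Py_Finalize();
  }
  /* Alternative runtime signal, settable via the cc_test env attribute.
   * The variable name here is made up for this sketch. */
  if (getenv("ACTUALLY_RUN_PYTHON") != NULL) {
    Py_Initialize();
    Py_Finalize();
  }
  return EXIT_SUCCESS;
}
```

Either guard keeps the build-time link honest; the env-var variant additionally allows opting in to a real interpreter startup by setting the variable on the test target.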
Force-pushed from ccb141c to 5f4179e
Just tried to set an envvar, but that evidently did not work. Am I on the right track? (I still can't run this test locally because of the macOS dylib failure, btw, so I am a little slow here.) |
The former was probably an oversight, the latter is a consequence of the addition of free-threading, which wasn't there when this PR was first opened.
Come on dude, why will it not work? What's so special about Windows that the test won't even output logs anymore? tl;dr of my latest attempt: introduce an environment variable on the test target and check for it at runtime. Maybe it's time to install that Windows virtualization thingy for Mac. |
Looks like the missing part was current_py_cc_libs in the deps of the test? Glad you figured it out! |
Thanks! If you'd like, I could also make another PR to enable the other test for Windows now. That effectively tests locating the non-ABI3 libs, which has value in its own right in my opinion. |
When targeting the Python Stable ABI on Windows (by setting the Py_LIMITED_API macro to a Python minimum version hex), the unversioned python3.lib needs to be linked instead of the versioned one (e.g. python38.lib for Python 3.8).
Python's own config sets the library to link by default in a header called pyconfig.h (https://github.com/python/cpython/blob/9cc9e277254023c0ca08e1a9e379fd89475ca9c2/PC/pyconfig.h#L270), which prompts the linker to search for python3.lib if a stable ABI extension is built using `@rules_python` toolchains.

Since this library is not exported on Windows in the `python_repository()` rule, building Python C++ extensions with rules_python toolchains fails in the linking step, because the library is never copied. Consequently, it is added now to allow Python SABI extensions to be built (and linked) on Windows with `@rules_python`.

Since Python takes responsibility for linking the correct lib on Windows, and never both at the same time, no other changes are made.