Ensure that we always get the dev versions of the relevant packages #1
We could also, for each test, put the artefacts we want to test (latest setuptools wheel, pip master sdist, etc.) in a directory …
Can we guarantee that …
According to https://github.com/pypa/virtualenv/blob/bcce79d2e827c32153a1a199ddbb99f289f7694d/virtualenv.py#L1014 it seems we cannot ^^ So a small http.server serving our artefacts with the matching …
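A minimal sketch of that idea, assuming the artefacts have already been built into a local directory; serve_artefacts is a hypothetical helper name, and the returned URL would be handed to pip via --find-links or similar:

```python
import functools
import http.server
import threading

def serve_artefacts(directory, port=0):
    """Serve a directory of pre-built wheels/sdists over plain HTTP.

    Returns the running server plus the base URL to point tools at.
    port=0 lets the OS pick a free port, which keeps tests parallel-safe.
    """
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=directory
    )
    server = http.server.ThreadingHTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    host, real_port = server.server_address
    return server, f"http://{host}:{real_port}/"
```

Note this only serves flat files; pip would need --find-links (or a PEP 503 "simple" directory layout) rather than a full index URL.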
The nice thing about us owning all these packages is that we can make the necessary changes to allow this.
I'm not 100% sure what tests we're actually planning on doing, but we could build virtualenvs by using …

Ideally, a local index is the best option, assuming no tools clear environment variables (I know tox does for some, but I don't imagine it would be the pip ones, I think it's more …
If we go with a local index supplying all the files we need, I think we would want to do it in two stages - first stage builds the index, then the second stage uses it. I think we can probably mess with …
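One way the second stage could redirect everything at the local index is pip's PIP_INDEX_URL environment variable, which pip honours wherever it runs, including the isolated build environments it spawns, provided the variable survives into the subprocess. A sketch, with an illustrative helper name:

```python
import contextlib
import os

@contextlib.contextmanager
def local_index(url):
    """Temporarily point pip (and tools that shell out to pip) at *url*.

    Saves and restores the previous value so tests don't leak state.
    """
    old = os.environ.get("PIP_INDEX_URL")
    os.environ["PIP_INDEX_URL"] = url
    try:
        yield
    finally:
        if old is None:
            os.environ.pop("PIP_INDEX_URL", None)
        else:
            os.environ["PIP_INDEX_URL"] = old
```

The caveat from the comment above still applies: any tool that scrubs its subprocess environment (tox, for instance, unless told to pass the variable through) would defeat this.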
I’ve had very good experiences with devpi. Although it’s not precisely PyPI, it has some nice properties (lightweight, pure Python, supports overwriting releases) and simulates PyPI pretty well. I’d recommend it.
As I mentioned in this post on the discourse, one of the tougher things about doing these integration tests right will be ensuring that all the build tools pull in the dev versions of the other relevant build tools. Considering that virtualenv, pip and tox will all be pulling packages from PyPI and will explicitly be trying to build isolated build environments, I imagine this will be kinda tough.

I wanted to start a thread on this repo to discuss it, so we can have a dedicated issue rather than just having all the design decisions hashed out on one mega-thread on discourse.
I think the easiest way to make sure this works correctly is to spin up our own PyPI endpoint locally. We would first use the stable build tools to build wheels and sdists for each of the relevant dev tools, then download all the relevant files we're going to need for the other packages from PyPI before the tests start.
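A rough sketch of that preparation step, assuming it's acceptable to shell out to the stable pip; the function and argument names here are illustrative, not an agreed design:

```python
import subprocess
import sys

def prepare_index_dir(dev_checkouts, pinned_deps, dest):
    """Fill *dest* with every file the local index will need to serve.

    dev_checkouts: paths to local checkouts of the dev tools (setuptools,
    pip, ...) to build with the stable toolchain. pinned_deps: names of
    third-party packages to fetch from real PyPI ahead of the tests.
    """
    for path in dev_checkouts:
        # Build a wheel of the dev checkout using the stable tools.
        subprocess.run(
            [sys.executable, "-m", "pip", "wheel", "--no-deps",
             "-w", dest, path],
            check=True,
        )
    if pinned_deps:
        # Fetch everything else up front so the tests never hit real PyPI.
        subprocess.run(
            [sys.executable, "-m", "pip", "download", "-d", dest,
             *pinned_deps],
            check=True,
        )
```

The resulting directory could then be served (or uploaded to devpi) as the tests' sole index.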
We could also try and use a caching proxy for PyPI where we pre-cache the dev versions, but I'm worried that there could be situations where the resolver ends up falling back to non-dev versions and we're suddenly silently not testing what we think we are.