This repository has been archived by the owner on May 20, 2021. It is now read-only.

Ensure that we always get the dev versions of the relevant packages #1

Open
pganssle opened this issue Jan 30, 2019 · 7 comments

@pganssle
Member

As I mentioned in this post on the discourse, one of the tougher things about doing these integration tests right will be ensuring that all the build tools pull in the dev versions of the other relevant build tools. Considering that virtualenv, pip and tox will all be pulling packages from PyPI and will explicitly be trying to build isolated build environments, I imagine this will be kinda tough.

I wanted to start a thread on this repo to discuss it, so we can have a dedicated issue rather than just having all the design decisions hashed out on one mega-thread on discourse.

I think the easiest way to make sure this works correctly is to spin up our own PyPI endpoint locally. We would first use the stable build tools to build wheels and sdists for each of the relevant dev tools, then download all the relevant files we're going to need for the other packages from PyPI before the tests start.
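The "build first, serve later" idea can be sketched roughly like this: given a directory of pre-built dev wheels and sdists, generate the directory layout of a PEP 503 "simple" index that any index server could then serve. This is a minimal illustration, not code from the project; the function names, the `packages/` link prefix, and the crude project-name parsing are all assumptions.

```python
# Sketch: lay out a minimal PEP 503 "simple" index from a directory of
# pre-built dev artefacts (wheels/sdists). Hypothetical helper names;
# real wheel filename parsing needs more care than split("-").
import html
import pathlib
import re

def normalize(name: str) -> str:
    # PEP 503 name normalization: runs of -, _, . become a single "-"
    return re.sub(r"[-_.]+", "-", name).lower()

def build_simple_index(dist_dir: pathlib.Path, index_dir: pathlib.Path) -> None:
    index_dir.mkdir(parents=True, exist_ok=True)
    projects: dict[str, list[str]] = {}
    for artefact in sorted(dist_dir.iterdir()):
        # crude project-name guess: the text before the first "-"
        project = normalize(artefact.name.split("-")[0])
        projects.setdefault(project, []).append(artefact.name)
    for project, files in projects.items():
        page = index_dir / project
        page.mkdir(exist_ok=True)
        links = "\n".join(
            f'<a href="../../packages/{html.escape(f)}">{html.escape(f)}</a>'
            for f in sorted(files)
        )
        (page / "index.html").write_text(f"<html><body>\n{links}\n</body></html>")
    root = "\n".join(f'<a href="{p}/">{p}</a>' for p in sorted(projects))
    (index_dir / "index.html").write_text(f"<html><body>\n{root}\n</body></html>")
```

Because every dev artefact is present in the index and public PyPI is never consulted, a resolver cannot silently fall back to a released version.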

We could also try and use a caching proxy for PyPI where we pre-cache the dev versions, but I'm worried that there could be situations where the resolver ends up falling back to non-dev versions and we're suddenly silently not testing what we think we are.

@xavfernandez
Member

> I think the easiest way to make sure this works correctly is to spin up our own PyPI endpoint locally.

We could also, for each test, put the artefacts we want to test (latest setuptools wheel, pip master sdist, etc.) in a `packages` directory and use the pip options `--no-index --find-links=./packages`.

@pganssle
Member Author

> We could also, for each test, put the artefacts we want to test (latest setuptools wheel, pip master sdist, etc) in a directory packages and use pip options --no-index --find-links=./packages.

Can we guarantee that virtualenv is using these pip options, though? If we don't give the integration tests access to the public PyPI during the installation tests, we're guaranteed that the tests were run using the dev versions. If not, we may install setuptools==41.0.0.deva394ad33cffeda in our current environment only to have pip and virtualenv install setuptools==40.7.0 in their respective isolated environments.

@xavfernandez
Member

> Can we guarantee that virtualenv is using these pip options, though?

According to https://github.com/pypa/virtualenv/blob/bcce79d2e827c32153a1a199ddbb99f289f7694d/virtualenv.py#L1014 it seems we cannot ^^

So a small `http.server` serving our artefacts with the matching `PIP_INDEX_URL` might be simpler.
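A minimal sketch of that idea: serve an artefact directory with the stdlib `http.server` and export `PIP_INDEX_URL` so that pip (and anything that shells out to pip without scrubbing the environment) picks up the local index. The directory layout and the `/simple/` path are assumptions for illustration.

```python
# Sketch: serve a local artefact directory and point pip at it via
# PIP_INDEX_URL. port=0 lets the OS pick a free port.
import functools
import os
import threading
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

def serve_index(directory: str, port: int = 0):
    handler = functools.partial(SimpleHTTPRequestHandler, directory=directory)
    server = ThreadingHTTPServer(("127.0.0.1", port), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    host, bound_port = server.server_address
    url = f"http://{host}:{bound_port}/simple/"
    # Child processes (tox, virtualenv, pip itself) inherit this variable,
    # provided the tool does not clear its environment.
    os.environ["PIP_INDEX_URL"] = url
    return server, url
```

The caveat raised later in the thread applies: this only works if none of the tools under test strip `PIP_*` variables from the environment.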

@gaborbernat
Contributor

The nice thing about us owning all these packages is that we can make the necessary changes to allow this.

@pfmoore
Member

pfmoore commented Jan 30, 2019

I'm not 100% sure what tests we're actually planning on doing, but we could build virtualenvs by using `--no-setuptools` and doing all of our installs "by hand". Or by putting our dev builds in a `virtualenv_support` directory (which is where virtualenv picks them up from).

Ideally, a local index is the best option, assuming no tools clear environment variables (I know tox does for some, but I don't imagine it would be the pip ones, I think it's more PYTHONPATH and the like).

@pganssle
Member Author

> Ideally, a local index is the best option, assuming no tools clear environment variables (I know tox does for some, but I don't imagine it would be the pip ones, I think it's more PYTHONPATH and the like).

If we go with a local index supplying all the files we need, I think we would want to do it in two stages - first stage builds the index, then the second stage uses it. I think we can probably mess with /etc/hosts or do something equivalent to prevent access to public PyPI at the network level in stage 2, then we don't have to worry as much about what respects environment vars.
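The stage-2 network block could be as simple as appending entries to `/etc/hosts` so PyPI's hostnames resolve to localhost. A small sketch of the lines one might append; the hostname list is an assumption about which hosts pip actually contacts.

```python
# Sketch: /etc/hosts entries that redirect PyPI hostnames to loopback,
# so any stray request from an isolated build environment fails fast.
# Hostname list is an assumption, not exhaustive.
PYPI_HOSTS = ["pypi.org", "files.pythonhosted.org", "pypi.python.org"]

def hosts_block_lines(hosts=PYPI_HOSTS):
    return [f"127.0.0.1 {host}" for host in hosts]
```

Blocking at the name-resolution level means we do not have to audit every tool in the chain for whether it honours `PIP_INDEX_URL` or `--no-index`.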

@jaraco
Member

jaraco commented Jan 30, 2019

I’ve had very good experiences with devpi. Although it’s not precisely PyPI, it has some nice properties (lightweight, pure-python, supports overwriting releases) and simulates PyPI pretty well. I’d recommend it.
