Replace vcr-py for tests #1198

Merged

51 commits merged into master from introduce-proxay-for-tests on Dec 3, 2024

Changes from 30 commits

Commits (51)
4bbc8e9
Adapt test.sh to use proxay.
markbader Sep 25, 2024
db4a963
wip replace vcr-py with proxy.
markbader Sep 26, 2024
a9ad224
Merge branch 'master' into introduce-proxay-for-tests
markbader Oct 14, 2024
33eb8e1
working on formatting and cleaning responses.
markbader Oct 14, 2024
72dca38
Merge branch 'master' into introduce-proxay-for-tests
markbader Oct 14, 2024
13437bc
Add type hint.
markbader Oct 14, 2024
f0b1604
Update uv lock.
markbader Oct 15, 2024
e4bda84
Change API requests in tests to use local webknossos.
markbader Oct 15, 2024
999e76e
Experiment with failing tests.
markbader Oct 22, 2024
760c7b0
Merge branch 'master' into introduce-proxay-for-tests
markbader Oct 22, 2024
2b4d5a4
Remove download from pytest fixture.
markbader Oct 23, 2024
594e62c
Merge branch 'master' into introduce-proxay-for-tests
markbader Oct 24, 2024
1ff7de9
Update test-cassettes and fix some tests.
markbader Oct 24, 2024
b051312
Run linter.
markbader Oct 24, 2024
6e67c64
Merge branch 'master' into introduce-proxay-for-tests
markbader Oct 28, 2024
a2b4c08
Remove time strings from tests.
markbader Oct 28, 2024
20e540b
Adapt tests with zarr streaming.
markbader Oct 29, 2024
b48e4fa
Merge branch 'master' into introduce-proxay-for-tests
markbader Oct 29, 2024
031759e
Working on tests that use annotations.
markbader Nov 1, 2024
99f5df8
Add debugging test mode that starts replay proxy.
markbader Nov 4, 2024
15d8af7
Add fixture for aiohttp use env variables.
markbader Nov 7, 2024
6bce46d
Set l4_sample public for running the remote_mags tests.
markbader Nov 21, 2024
a6fbe04
Add cassette files for tests.
markbader Nov 21, 2024
be074fa
Merge branch 'master' into introduce-proxay-for-tests
markbader Nov 21, 2024
248389a
Merge branch 'master' into introduce-proxay-for-tests
markbader Nov 21, 2024
78353a0
Move access to remote MagViews out of fixture to capture requests in …
markbader Nov 21, 2024
85cc153
Run linter.
markbader Nov 21, 2024
53ae8f3
Update cassette for annotation_upload_download_roundtrip.
markbader Nov 22, 2024
6e1aed0
Update readme and contribution guideline.
markbader Nov 22, 2024
182b35d
Update ci.
markbader Nov 22, 2024
892364e
Remove groups from CI.
markbader Nov 22, 2024
0c1b9f4
Update cassettes.
markbader Nov 22, 2024
4364106
Merge branch 'master' into introduce-proxay-for-tests
markbader Nov 22, 2024
338bc2f
Revert changes of ci and add timeout to test.sh.
markbader Nov 22, 2024
dd6149c
Update timeout in test.sh.
markbader Nov 22, 2024
60c9547
Update test_duration file.
markbader Nov 22, 2024
95f3567
Adapt test examples.
markbader Nov 25, 2024
e864add
Merge branch 'master' into introduce-proxay-for-tests
markbader Nov 25, 2024
54aa84f
Remove unused import.
markbader Nov 25, 2024
16abfa1
Skip flaky tests for now.
markbader Nov 26, 2024
f5c0086
Merge branch 'master' into introduce-proxay-for-tests
markbader Nov 26, 2024
42c478e
Merge branch 'master' into introduce-proxay-for-tests
markbader Nov 26, 2024
a26f276
Update tests to use l4_sample dataset.
markbader Nov 26, 2024
f3c61d1
Merge branches 'introduce-proxay-for-tests' and 'introduce-proxay-for…
markbader Nov 26, 2024
79111f0
Update cassettes.
markbader Nov 27, 2024
8f4afe0
Fix test for apply_merger_mode.
markbader Nov 28, 2024
f61aec3
Fix remote_datasets test.
markbader Nov 28, 2024
e56c626
Fix tests for adding remote mag and layer.
markbader Nov 28, 2024
c31ff21
Run linter and typechecker.
markbader Nov 28, 2024
ed0ba4e
Fix test learned_segmenter.
markbader Nov 28, 2024
5c64446
Merge branch 'master' into introduce-proxay-for-tests
markbader Nov 28, 2024
8 changes: 7 additions & 1 deletion .github/workflows/ci.yml
@@ -166,7 +166,10 @@ jobs:
with:
# Install a specific version of uv.
version: "0.4.22"


- name: Install proxay
run: npm install -g proxay

- name: Set up Python ${{ matrix.python-version }}
run: uv python install ${{ matrix.python-version }}

@@ -182,6 +185,9 @@ jobs:
if: ${{ matrix.group == 1 && matrix.python-version == '3.11' }}
run: ./typecheck.sh

- name: Setup tmate session
uses: mxschmitt/action-tmate@v3

- name: Python tests
timeout-minutes: 30
env:
15 changes: 10 additions & 5 deletions CONTRIBUTING.md
@@ -133,15 +133,20 @@ Internal workflows for scalable minds:

The `webknossos` folder contains examples, which are not part of the package, but are tested via `tests/test_examples.py` and added to the documentation (see `docs/src/webknossos-py/examples`).

Running the `./test.sh` script requires `proxay`, which can be installed with either [NPM](https://www.npmjs.com) or [yarn](https://yarnpkg.com/getting-started/install):
```bash
npm install --global proxay

# or if you're using yarn
yarn global add proxay
```

The tests also cover the WEBKNOSSOS client functionality. There are two modes to run the tests:

1. `./test.sh --refresh-snapshots`, sending network requests to a WEBKNOSSOS instance:
This expects a local WEBKNOSSOS setup with specific test data, which is shipped with WEBKNOSSOS. If you're starting and running WEBKNOSSOS manually, please use port 9000 (the default) and run the `tools/postgres/dbtool.js prepare-test-db` script in the WEBKNOSSOS repository (⚠️ this overwrites your local WEBKNOSSOS database). Alternatively, a `docker compose` setup is started automatically for the tests, see `./test.sh` and `tests/docker-compose.yml` for details. The network requests & response are recorded as "cassettes" by [vcr.py](https://vcrpy.readthedocs.io), see next point:
2. `./test.sh` replays responses from previous network snapshots using [vcr.py](https://vcrpy.readthedocs.io) via [pytest-recording](https://github.com/kiwicom/pytest-recording). No additional network requests are allowed in this mode.
This expects a local WEBKNOSSOS setup with specific test data, which is shipped with WEBKNOSSOS. If you're starting and running WEBKNOSSOS manually, please use port 9000 (the default) and run the `tools/postgres/dbtool.js prepare-test-db` script in the WEBKNOSSOS repository (⚠️ this overwrites your local WEBKNOSSOS database). Alternatively, a `docker compose` setup is started automatically for the tests, see `./test.sh` and `tests/docker-compose.yml` for details. The network requests & responses are recorded as "cassettes" by [proxay](https://github.com/airtasker/proxay), see next point:
2. `./test.sh` replays responses from previous network snapshots using [proxay](https://github.com/airtasker/proxay); see the sketch below.
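
A simplified sketch of what `./test.sh` does under the hood in these two modes (the exact flags and cleanup logic live in `webknossos/test.sh`; the tests are pointed at the proxay instance by the test setup):

```bash
# Mode 1 (--refresh-snapshots): proxay forwards requests to a local WEBKNOSSOS
# on port 9000 and records the responses as cassettes.
proxay --mode record --host http://localhost:9000 --tapes-dir tests/cassettes &

# Mode 2 (default): proxay replays the recorded cassettes, so no WEBKNOSSOS
# instance (and no network access to it) is needed.
proxay --mode replay --tapes-dir tests/cassettes &
```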

`./test.sh --store-durations` updates the durations for
[`pytest-split`](https://jerry-git.github.io/pytest-split),
which is used in the CI to split the tests for different runners.

#### `cluster_tools` package

10 changes: 4 additions & 6 deletions cluster_tools/dockered-slurm/docker-compose.yml
@@ -1,5 +1,3 @@
version: "2.2"

services:
mysql:
image: mysql:5.7
@@ -16,7 +14,7 @@ services:

slurmdbd:
image: scalableminds/slurm-docker-cluster:master__11274637426
command: ["slurmdbd"]
command: [ "slurmdbd" ]
container_name: slurmdbd
hostname: slurmdbd
volumes:
@@ -32,7 +30,7 @@ services:

slurmctld:
image: scalableminds/slurm-docker-cluster:master__11274637426
command: ["slurmctld"]
command: [ "slurmctld" ]
container_name: slurmctld
environment:
USER: "root"
@@ -53,7 +51,7 @@ services:

c1:
image: scalableminds/slurm-docker-cluster:master__11274637426
command: ["slurmd"]
command: [ "slurmd" ]
hostname: c1
container_name: c1
volumes:
@@ -71,7 +69,7 @@ services:

c2:
image: scalableminds/slurm-docker-cluster:master__11274637426
command: ["slurmd"]
command: [ "slurmd" ]
hostname: c2
container_name: c2
volumes:
2 changes: 0 additions & 2 deletions docs/uv.lock

Some generated files are not rendered by default.

1 change: 1 addition & 0 deletions webknossos/Changelog.md
@@ -17,6 +17,7 @@ For upgrade instructions, please check the respective _Breaking Changes_ section
### Added

### Changed
- Removes vcr-py from developer dependencies for testing and adds proxay for recording and replaying API requests. [#1198](https://github.com/scalableminds/webknossos-libs/pull/1198)

### Fixed

19 changes: 9 additions & 10 deletions webknossos/examples/accessing_metadata.py
@@ -2,19 +2,18 @@


def main() -> None:
with wk.webknossos_context(url="https://webknossos.org/"):
l4_sample_dataset = wk.Dataset.open_remote("l4_sample")
# Access the metadata of the dataset
print(l4_sample_dataset.metadata)
l4_sample_dataset = wk.Dataset.open_remote("l4_sample")
# Access the metadata of the dataset
print(l4_sample_dataset.metadata)

# Edit the metadata of the dataset
l4_sample_dataset.metadata["new_key"] = "new_value"
# Edit the metadata of the dataset
l4_sample_dataset.metadata["new_key"] = "new_value"

# Access metadata of a folder
print(l4_sample_dataset.folder.metadata)
# Access metadata of a folder
print(l4_sample_dataset.folder.metadata)

# Edit the metadata of the folder
l4_sample_dataset.folder.metadata["new_folder_key"] = "new_folder_value"
# Edit the metadata of the folder
l4_sample_dataset.folder.metadata["new_folder_key"] = "new_folder_value"


if __name__ == "__main__":
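The explicit `webknossos_context` wrapper is dropped here, presumably because the recorded test runs now supply the target URL (the proxay instance) through the environment. When running the example standalone against webknossos.org, the context can still be set explicitly, as the previous version of the example did:

```python
import webknossos as wk

# Explicitly select the WEBKNOSSOS instance instead of relying on defaults
# or environment variables (this mirrors the example before this change).
with wk.webknossos_context(url="https://webknossos.org/"):
    l4_sample_dataset = wk.Dataset.open_remote("l4_sample")
    print(l4_sample_dataset.metadata)
```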
5 changes: 2 additions & 3 deletions webknossos/examples/apply_merger_mode.py
@@ -20,10 +20,9 @@ def main() -> None:
###############################################

dataset = wk.Dataset.download(
"l4_sample_dev",
"scalable_minds",
"l4_sample",
"Organization_X",
path="testoutput/l4_sample_dev",
webknossos_url="https://webknossos.org",
)
in_layer = cast(wk.SegmentationLayer, dataset.get_layer("segmentation"))
in_mag1 = in_layer.get_mag("1")
4 changes: 1 addition & 3 deletions webknossos/examples/learned_segmenter.py
@@ -1,7 +1,6 @@
import os
from functools import partial
from tempfile import TemporaryDirectory
from time import gmtime, strftime

import numpy as np
from skimage import feature
@@ -20,8 +19,7 @@ def main() -> None:
# Step 1: Read the training data from the annotation and the dataset's color
# layer (the data will be streamed from WEBKNOSSOS to our local computer)
training_data_bbox = annotation.user_bounding_boxes[0] # type: ignore[index]
time_str = strftime("%Y-%m-%d_%H-%M-%S", gmtime())
new_dataset_name = annotation.dataset_name + f"_segmented_{time_str}"
new_dataset_name = f"{annotation.dataset_name}_segmented"
with wk.webknossos_context("https://webknossos.org"):
dataset = annotation.get_remote_annotation_dataset()

4 changes: 1 addition & 3 deletions webknossos/examples/upload_dicom_stack.py
@@ -1,14 +1,12 @@
from pathlib import Path
from time import gmtime, strftime

import webknossos as wk


def main() -> None:
time_str = strftime("%Y-%m-%d_%H-%M-%S", gmtime())
dataset = wk.Dataset.from_images(
str(Path(__file__).parent.parent / "testdata" / "dicoms"),
f"dicom_dataset_{time_str}",
"dicom_dataset_upload",
voxel_size=(12, 12, 12),
)
dataset.compress()
10 changes: 2 additions & 8 deletions webknossos/examples/upload_image_data.py
@@ -1,5 +1,3 @@
from time import gmtime, strftime

import numpy as np
from skimage import data

@@ -19,12 +17,8 @@ def main() -> None:
# we expect the following dimensions: Channels, X, Y, Z.
img = np.transpose(img, [1, 3, 2, 0])

# choose a name for our dataset
time_str = strftime("%Y-%m-%d_%H-%M-%S", gmtime())
name = f"cell_{time_str}"

# voxel_size is defined in nm
ds = wk.Dataset(name, voxel_size=(260, 260, 290))
# choose name and voxel size (voxel_size is defined in nm)
ds = wk.Dataset("cell_dataset", voxel_size=(260, 260, 290))

ds.default_view_configuration = DatasetViewConfiguration(zoom=0.35)

4 changes: 1 addition & 3 deletions webknossos/examples/upload_tiff_stack.py
@@ -1,14 +1,12 @@
from pathlib import Path
from time import gmtime, strftime

import webknossos as wk


def main() -> None:
time_str = strftime("%Y-%m-%d_%H-%M-%S", gmtime())
dataset = wk.Dataset.from_images(
str(Path(__file__).parent.parent / "testdata" / "tiff"),
f"tiff_dataset_{time_str}",
"tiff_dataset_upload",
voxel_size=(12, 12, 12),
)
dataset.compress()
6 changes: 5 additions & 1 deletion webknossos/local_wk_setup.sh
@@ -6,12 +6,12 @@ function export_vars {

function ensure_local_test_wk {
export_vars
WK_DOCKER_DIR="tests"

if ! curl -sf localhost:9000/api/health; then
echo "Using docker compose setup with the docker tag $DOCKER_TAG"
echo " To change this, please update DOCKER_TAG in local_wk_setup.sh"

WK_DOCKER_DIR="tests"
pushd $WK_DOCKER_DIR > /dev/null
docker compose pull webknossos
if [ ! -d binaryData/Organization_X/l4_sample ]; then
@@ -34,6 +34,10 @@ function ensure_local_test_wk {
while ! curl -sf localhost:9000/api/health; do
sleep 5
done
# docker compose exec -T --user root webknossos sh -c "echo name,url,publicUrl,key,isScratch,isDeleted,allowsUpload,onlyAllowedOrganization,reportUsedStorageEnabled\n'http://localhost','http://localhost:9000','http://localhost:3000','something-secure',f,f,t,,f > test/db/dataStores.csv"
# docker compose exec -T --user root webknossos sh -c "sed -i 's|publicUri = \${http.uri}|publicUri = \"localhost:3000\"|' conf/application.conf"
docker compose exec -T --user root webknossos sh -c "sed -i \"s|f,t,'l4_sample'|t,t,'l4_sample'|\" test/db/dataSets.csv"
# docker compose exec -T --user root webknossos sh -c "cat test/db/dataSets.csv"
OUT=$(docker compose exec -T webknossos tools/postgres/dbtool.js prepare-test-db 2>&1) || echo "$OUT"
popd > /dev/null
else
4 changes: 1 addition & 3 deletions webknossos/pyproject.toml
@@ -96,8 +96,6 @@ dev-dependencies = [
"mypy ~=1.10.0",
"pytest ~=8.3.2",
"pytest-custom-exit-code ~=0.3.0",
"pytest-recording ~=0.13.0",
"vcrpy ~=6.0",
"pytest-split ~=0.9.0",
"pytest-sugar ~=1.0.0",
"pytest-timeout ~=2.3.0",
@@ -179,7 +177,7 @@ indent-style = "space"
line-ending = "auto"

[tool.pytest.ini_options]
markers = ["with_vcr: Runs with VCR recording and optionally blocked network"]
markers = ["use_proxay: Runs with a proxay instance recording/replaying http responses", "serial"]
testpaths = ["tests"]

[build-system]
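As an illustration of the new marker, a test that should record/replay its HTTP traffic through proxay would be marked like this (hypothetical sketch; the real tests and fixture wiring live under `webknossos/tests/`):

```python
import pytest
import webknossos as wk


@pytest.mark.use_proxay  # run against the proxay instance started by test.sh
def test_open_remote_dataset() -> None:
    # Hypothetical test body: the dataset name matches the seeded l4_sample data.
    dataset = wk.Dataset.open_remote("l4_sample")
    print(dataset.metadata)
```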
47 changes: 36 additions & 11 deletions webknossos/test.sh
@@ -3,31 +3,56 @@ set -eEuo pipefail

source local_wk_setup.sh

export_vars


# Note that pytest should be executed via `python -m`, since
# this will ensure that the current directory is added to sys.path
# (which is standard python behavior). This is necessary so that the imports
# refer to the checked out (and potentially modified) code.
PYTEST="uv run --all-extras --frozen python -m pytest --suppress-no-test-exit-code"

find tests/binaryData/Organization_X -mindepth 1 -maxdepth 1 -type d ! -name 'l4_sample' ! -name 'e2006_knossos' -exec rm -rf {} +


if [ $# -gt 0 ] && [ "$1" = "--refresh-snapshots" ]; then
ensure_local_test_wk

rm -rf tests/cassettes
rm -rf tests/**/cassettes

# Starts a proxy server in record mode on port 3000 and sets the HTTP_PROXY env var
proxay --mode record --host http://localhost:9000 --tapes-dir tests/cassettes &
PROXAY_PID=$!

shift
$PYTEST --record-mode once -m "with_vcr" "$@"
stop_local_test_wk
elif [ $# -gt 0 ] && [ "$1" = "--add-snapshots" ]; then
ensure_local_test_wk
shift
$PYTEST --record-mode once -m "with_vcr" "$@"
$PYTEST "-m" "use_proxay" "$@"
PYTEST_EXIT_CODE=$?

# Kill the proxy server
kill $PROXAY_PID
wait $PROXAY_PID

stop_local_test_wk

exit $PYTEST_EXIT_CODE
elif [ $# -gt 0 ] && [ "$1" = "--debug-cassettes" ]; then
# This will start a proxay server in replay mode so that the stored cassettes can be used for debugging tests.

export_vars

proxay --mode replay --tapes-dir tests/cassettes &
PROXAY_PID=$!
echo "Proxay server is running in replay mode. Press Ctrl+C to stop."
trap 'kill $PROXAY_PID; exit' INT
wait $PROXAY_PID
else
$PYTEST --block-network -m "with_vcr" "$@"
export_vars

proxay --mode replay --tapes-dir tests/cassettes 2>&1 > /dev/null &
PROXAY_PID=$!

$PYTEST "$@"
PYTEST_EXIT_CODE=$?

kill $PROXAY_PID

exit $PYTEST_EXIT_CODE
fi
$PYTEST --disable-recording -m "not with_vcr" "$@"
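Taken together, the reworked script supports three invocations (summary sketch; the script above is authoritative):

```bash
# Re-record cassettes against a local WEBKNOSSOS (a docker compose setup is
# started automatically if none is running on port 9000).
./test.sh --refresh-snapshots

# Keep a proxay instance running in replay mode, e.g. to debug single tests
# against the stored cassettes from an IDE or a separate shell.
./test.sh --debug-cassettes

# Default: replay the recorded cassettes through proxay; no WEBKNOSSOS
# instance or network access to it is required.
./test.sh
```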
Binary files (14) not shown.