Commit 2127aa5

authored
Merge branch 'main' into async-dtreec
2 parents a711167 + 9aee8b7 commit 2127aa5

25 files changed: +644 −63 lines

.github/workflows/ci-additional.yaml

Lines changed: 24 additions & 14 deletions

@@ -244,10 +244,8 @@ jobs:
   min-version-policy:
     name: Minimum Version Policy
     runs-on: "ubuntu-latest"
-    needs: detect-ci-trigger
-    # min-version-policy doesn't work with Pixi yet https://github.com/pydata/xarray/pull/10888#discussion_r2504335457
-    if: false
-    # if: needs.detect-ci-trigger.outputs.triggered == 'false'
+    needs: [detect-ci-trigger, cache-pixi-lock]
+    if: needs.detect-ci-trigger.outputs.triggered == 'false'
     defaults:
       run:
         shell: bash -l {0}
@@ -260,18 +258,30 @@ jobs:
         with:
          fetch-depth: 0 # Fetch all history for all branches and tags.

-      - uses: actions/setup-python@v6
+      - name: Restore cached pixi lockfile
+        uses: actions/cache/restore@v5
+        id: restore-pixi-lock
         with:
-          python-version: "3.x"
+          enableCrossOsArchive: true
+          path: |
+            pixi.lock
+          key: ${{ needs.cache-pixi-lock.outputs.cache-id }}

-      - name: All-deps minimum versions policy
-        uses: xarray-contrib/minimum-dependency-versions@3db8e1c17328ee1e27dfe4db90d908644856eb61 # v1.0.0
+      - uses: prefix-dev/[email protected]
         with:
-          policy: ci/policy.yaml
-          environment-paths: ci/requirements/min-all-deps.yml
+          pixi-version: ${{ env.PIXI_VERSION }}
+          cache: true
+          environments: "policy"
+          cache-write: ${{ github.event_name == 'push' && github.ref_name == 'main' }}

       - name: Bare minimum versions policy
-        uses: xarray-contrib/minimum-dependency-versions@3db8e1c17328ee1e27dfe4db90d908644856eb61 # v1.0.0
-        with:
-          policy: ci/policy.yaml
-          environment-paths: ci/requirements/bare-minimum.yml
+        run: |
+          pixi run policy-bare-minimum
+
+      - name: Bare minimum and scipy versions policy
+        run: |
+          pixi run policy-bare-min-and-scipy
+
+      - name: All-deps minimum versions policy
+        run: |
+          pixi run policy-min-versions

.github/workflows/nightly-wheels.yml

Lines changed: 1 addition & 1 deletion

@@ -38,7 +38,7 @@ jobs:
           fi

       - name: Upload wheel
-        uses: scientific-python/upload-nightly-action@b36e8c0c10dbcfd2e05bf95f17ef8c14fd708dbf # 0.6.2
+        uses: scientific-python/upload-nightly-action@5748273c71e2d8d3a61f3a11a16421c8954f9ecf # 0.6.3
         with:
           anaconda_nightly_upload_token: ${{ secrets.ANACONDA_NIGHTLY }}
           artifacts_path: dist

ci/policy.yaml

Lines changed: 4 additions & 1 deletion

@@ -24,5 +24,8 @@ policy:
   - pytest-xdist
   - pytest-hypothesis
   - hypothesis
+  - pytz
+  - pytest-reportlog
 # these packages don't fail the CI, but will be printed in the report
-ignored_violations: []
+ignored_violations:
+  - array-api-strict

doc/Makefile

Lines changed: 1 addition & 1 deletion

@@ -25,7 +25,7 @@ help:
 	@echo "  singlehtml to make a single large HTML file"
 	@echo "  pickle     to make pickle files"
 	@echo "  json       to make JSON files"
-	@echo "  htmlhelp   to make HTML files and a HTML help project"
+	@echo "  htmlhelp   to make HTML files and an HTML help project"
 	@echo "  qthelp     to make HTML files and a qthelp project"
 	@echo "  applehelp  to make an Apple Help Book"
 	@echo "  devhelp    to make HTML files and a Devhelp project"

doc/user-guide/testing.rst

Lines changed: 1 addition & 1 deletion

@@ -178,7 +178,7 @@ numpy array (so-called "duck array wrapping", see :ref:`wrapping numpy-like arra
 Imagine we want to write a strategy which generates arbitrary ``Variable`` objects, each of which wraps a
 :py:class:`sparse.COO` array instead of a ``numpy.ndarray``. How could we do that? There are two ways:

-1. Create a xarray object with numpy data and use the hypothesis' ``.map()`` method to convert the underlying array to a
+1. Create an xarray object with numpy data and use the hypothesis' ``.map()`` method to convert the underlying array to a
    different type:

 .. jupyter-execute::

doc/whats-new.rst

Lines changed: 3 additions & 0 deletions

@@ -49,6 +49,9 @@ Bug Fixes
   ``np.isclose`` by default to handle accumulated floating point errors from
   slicing operations. Use ``exact=True`` for exact comparison (:pull:`11035`).
   By `Ian Hunt-Isaak <https://github.com/ianhi>`_.
+- Ensure the :py:class:`~xarray.groupers.SeasonResampler` preserves the datetime
+  unit of the underlying time index when resampling (:issue:`11048`,
+  :pull:`11049`). By `Spencer Clark <https://github.com/spencerkclark>`_.

 Documentation
 ~~~~~~~~~~~~~

pixi.toml

Lines changed: 42 additions & 8 deletions

@@ -79,6 +79,8 @@ seaborn = "*"
 [feature.extras.dependencies]
 # array
 sparse = "*"
+pint = "*"
+array-api-strict = "*"

 # algorithms
 scipy = "*"
@@ -87,9 +89,10 @@ toolz = "*"
 # tutorial
 pooch = "*"

-# other
+# calendar
 cftime = "*"
-pint = "*"
+
+# other
 iris = "*"

 [feature.extras.pypi-dependencies]
@@ -106,11 +109,6 @@ pandas = "2.2.*"
 scipy = "1.13.*"

 [feature.min-versions.dependencies]
-# minimal versions for all dependencies
-# Note that when you update min-supported versions, you should:
-#   - Update the min version lower-bound in the corresponding feature(s) where applicable
-#   - Update this section to pin to the min version
-
 array-api-strict = "2.4.*" # dependency for testing the array api compat
 boto3 = "1.34.*"
 bottleneck = "1.4.*"
@@ -204,7 +202,6 @@ cartopy = "*"
 seaborn = "*"

 [feature.test.dependencies]
-array-api-strict = "*"
 pytest = "*"
 pytest-asyncio = "*"
 pytest-cov = "*"
@@ -292,6 +289,42 @@ cytoolz = "*"
 [feature.release.tasks]
 release-contributors = "python ci/release_contributors.py"

+[feature.policy.pypi-dependencies]
+xarray-minimum-dependency-policy = "*"
+
+[feature.policy.dependencies]
+python = "3.13.*"
+
+[feature.policy.tasks.check-policy]
+cmd = "minimum-versions validate --policy ci/policy.yaml --manifest-path pixi.toml {{ env }}"
+args = ["env"]
+
+[feature.policy.tasks]
+policy-bare-minimum = [
+  { task = "check-policy", args = [
+    "pixi:test-py311-bare-minimum",
+  ] },
+]
+policy-bare-min-and-scipy = [
+  { task = "check-policy", args = [
+    "pixi:test-py311-bare-min-and-scipy",
+  ] },
+]
+policy-min-versions = [
+  { task = "check-policy", args = [
+    "pixi:test-py311-min-versions",
+  ] },
+]
+policy = [
+  { task = "check-policy", args = [
+    """\
+    pixi:test-py311-bare-minimum \
+    pixi:test-py311-bare-min-and-scipy \
+    pixi:test-py311-min-versions \
+    """,
+  ] },
+]
+
 [environments]
 # Testing
 # test-just-xarray = { features = ["test"] } # https://github.com/pydata/xarray/pull/10888/files#r2511336147
@@ -392,3 +425,4 @@ doc = { features = [
 ] }
 pre-commit = { features = ["pre-commit"], no-default-feature = true }
 release = { features = ["release"], no-default-feature = true }
+policy = { features = ["policy"], no-default-feature = true }
Lines changed: 135 additions & 0 deletions

@@ -0,0 +1,135 @@
+"""Property tests comparing CoordinateTransformIndex to PandasIndex."""
+
+import functools
+import operator
+from collections.abc import Hashable
+from typing import Any
+
+import numpy as np
+import pytest
+
+pytest.importorskip("hypothesis")
+
+import hypothesis.strategies as st
+from hypothesis import given
+
+import xarray as xr
+import xarray.testing.strategies as xrst
+from xarray.core.coordinate_transform import CoordinateTransform
+from xarray.core.indexes import CoordinateTransformIndex
+from xarray.testing import assert_equal
+
+DATA_VAR_NAME = "_test_data_"
+
+
+class IdentityTransform(CoordinateTransform):
+    """Identity transform that returns dimension positions as coordinate labels."""
+
+    def forward(self, dim_positions: dict[str, Any]) -> dict[Hashable, Any]:
+        return dim_positions
+
+    def reverse(self, coord_labels: dict[Hashable, Any]) -> dict[str, Any]:
+        return coord_labels
+
+    def equals(
+        self, other: CoordinateTransform, exclude: frozenset[Hashable] | None = None
+    ) -> bool:
+        if not isinstance(other, IdentityTransform):
+            return False
+        return self.dim_size == other.dim_size
+
+
+def create_transform_da(sizes: dict[str, int]) -> xr.DataArray:
+    """Create a DataArray with IdentityTransform CoordinateTransformIndex."""
+    dims = list(sizes.keys())
+    shape = tuple(sizes.values())
+    data = np.arange(np.prod(shape)).reshape(shape)
+
+    # Create dataset with transform index for each dimension
+    ds = xr.Dataset({DATA_VAR_NAME: (dims, data)})
+    indexes = [
+        xr.Coordinates.from_xindex(
+            CoordinateTransformIndex(
+                IdentityTransform((dim,), {dim: size}, dtype=np.dtype(np.int64))
+            )
+        )
+        for dim, size in sizes.items()
+    ]
+    coords = functools.reduce(operator.or_, indexes)
+    return ds.assign_coords(coords).get(DATA_VAR_NAME)
+
+
+def create_pandas_da(sizes: dict[str, int]) -> xr.DataArray:
+    """Create a DataArray with standard PandasIndex (range index)."""
+    shape = tuple(sizes.values())
+    data = np.arange(np.prod(shape)).reshape(shape)
+    coords = {dim: np.arange(size) for dim, size in sizes.items()}
+    return xr.DataArray(
+        data, dims=list(sizes.keys()), coords=coords, name=DATA_VAR_NAME
+    )
+
+
+@given(
+    st.data(),
+    xrst.dimension_sizes(min_dims=1, max_dims=3, min_side=1, max_side=5),
+)
+def test_basic_indexing(data, sizes):
+    """Test basic indexing produces identical results for transform and pandas index."""
+    pandas_da = create_pandas_da(sizes)
+    transform_da = create_transform_da(sizes)
+    idxr = data.draw(xrst.basic_indexers(sizes=sizes))
+    pandas_result = pandas_da.isel(idxr)
+    transform_result = transform_da.isel(idxr)
+    # TODO: any indexed dim in pandas_result should be an indexed dim in transform_result
+    # This requires us to return a new CoordinateTransformIndex from .isel.
+    # for dim in pandas_result.xindexes:
+    #     assert isinstance(transform_result.xindexes[dim], CoordinateTransformIndex)
+    assert_equal(pandas_result, transform_result)
+
+    # not supported today
+    # pandas_result = pandas_da.sel(idxr)
+    # transform_result = transform_da.sel(idxr)
+    # assert_identical(pandas_result, transform_result)
+
+
+@given(
+    st.data(),
+    xrst.dimension_sizes(min_dims=1, max_dims=3, min_side=1, max_side=5),
+)
+def test_outer_indexing(data, sizes):
+    """Test outer indexing produces identical results for transform and pandas index."""
+    pandas_da = create_pandas_da(sizes)
+    transform_da = create_transform_da(sizes)
+    idxr = data.draw(xrst.outer_array_indexers(sizes=sizes, min_dims=1))
+    pandas_result = pandas_da.isel(idxr)
+    transform_result = transform_da.isel(idxr)
+    assert_equal(pandas_result, transform_result)
+
+    label_idxr = {
+        dim: np.arange(pandas_da.sizes[dim])[ind.data] for dim, ind in idxr.items()
+    }
+    pandas_result = pandas_da.sel(label_idxr)
+    transform_result = transform_da.sel(label_idxr, method="nearest")
+    assert_equal(pandas_result, transform_result)
+
+
+@given(
+    st.data(),
+    xrst.dimension_sizes(min_dims=2, max_dims=3, min_side=1, max_side=5),
+)
+def test_vectorized_indexing(data, sizes):
+    """Test vectorized indexing produces identical results for transform and pandas index."""
+    pandas_da = create_pandas_da(sizes)
+    transform_da = create_transform_da(sizes)
+    idxr = data.draw(xrst.vectorized_indexers(sizes=sizes))
+    pandas_result = pandas_da.isel(idxr)
+    transform_result = transform_da.isel(idxr)
+    assert_equal(pandas_result, transform_result)
+
+    label_idxr = {
+        dim: ind.copy(data=np.arange(pandas_da.sizes[dim])[ind.data])
+        for dim, ind in idxr.items()
+    }
+    pandas_result = pandas_da.sel(label_idxr, method="nearest")
+    transform_result = transform_da.sel(label_idxr, method="nearest")
+    assert_equal(pandas_result, transform_result)
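One detail worth calling out in the property-test file above: `create_transform_da` builds one `Coordinates` object per dimension and merges them into a single set via `functools.reduce(operator.or_, indexes)`. The same merge pattern can be shown with plain dicts, which also support the `|` merge operator (a stdlib analogy, not xarray itself):

```python
import functools
import operator

# One mapping per dimension, mirroring the per-dimension Coordinates objects.
per_dim = [{"x": 0}, {"y": 1}, {"z": 2}]

# reduce(or_, ...) folds the | merge operator over the list, left to right.
merged = functools.reduce(operator.or_, per_dim)
print(merged)  # {'x': 0, 'y': 1, 'z': 2}

# On key collisions the right-hand operand wins, just like dict.update.
collided = functools.reduce(operator.or_, [{"x": 0}, {"x": 9}])
print(collided)  # {'x': 9}
```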

properties/test_indexing.py

Lines changed: 66 additions & 0 deletions

@@ -0,0 +1,66 @@
+import pytest
+
+pytest.importorskip("hypothesis")
+
+import hypothesis.strategies as st
+from hypothesis import given
+
+import xarray as xr
+import xarray.testing.strategies as xrst
+
+
+def _slice_size(s: slice, dim_size: int) -> int:
+    """Compute the size of a slice applied to a dimension."""
+    return len(range(*s.indices(dim_size)))
+
+
+@given(
+    st.data(),
+    xrst.variables(dims=xrst.dimension_sizes(min_dims=1, max_dims=4, min_side=1)),
+)
+def test_basic_indexing(data, var):
+    """Test that basic indexers produce expected output shape."""
+    idxr = data.draw(xrst.basic_indexers(sizes=var.sizes))
+    result = var.isel(idxr)
+    expected_shape = tuple(
+        _slice_size(idxr[d], var.sizes[d]) if d in idxr else var.sizes[d]
+        for d in result.dims
+    )
+    assert result.shape == expected_shape
+
+
+@given(
+    st.data(),
+    xrst.variables(dims=xrst.dimension_sizes(min_dims=1, max_dims=4, min_side=1)),
+)
+def test_outer_indexing(data, var):
+    """Test that outer array indexers produce expected output shape."""
+    idxr = data.draw(xrst.outer_array_indexers(sizes=var.sizes, min_dims=1))
+    result = var.isel(idxr)
+    expected_shape = tuple(
+        len(idxr[d]) if d in idxr else var.sizes[d] for d in result.dims
+    )
+    assert result.shape == expected_shape
+
+
+@given(
+    st.data(),
+    xrst.variables(dims=xrst.dimension_sizes(min_dims=2, max_dims=4, min_side=1)),
+)
+def test_vectorized_indexing(data, var):
+    """Test that vectorized indexers produce expected output shape."""
+    da = xr.DataArray(var)
+    idxr = data.draw(xrst.vectorized_indexers(sizes=var.sizes))
+    result = da.isel(idxr)
+
+    # TODO: this logic works because the dims in idxr don't overlap with da.dims
+    # Compute expected shape from result dims
+    # Non-indexed dims keep their original size, indexed dims get broadcast size
+    broadcast_result = xr.broadcast(*idxr.values())
+    broadcast_sizes = dict(
+        zip(broadcast_result[0].dims, broadcast_result[0].shape, strict=True)
+    )
+    expected_shape = tuple(
+        var.sizes[d] if d in var.sizes else broadcast_sizes[d] for d in result.dims
+    )
+    assert result.shape == expected_shape
