Updating Argo tutorial to v4 #2199
Based on changes needed to the Argo tutorial
This also removes the period at the end, which created parsing issues in Visual Studio Code (which thought it was part of the path).
Note that there are multiple breaking problems in this file.
xref #2180
This fixes the issue that the `cycle_phase` is not updated correctly.
OK, so I found out what is going wrong with the updating of the […]. I also created a minimal breaking example of this in the […]. In this case, the test cases running […]
A more minimal example would be the following:

```diff
diff --git a/tests/v4/test_particleset_execute.py b/tests/v4/test_particleset_execute.py
index 871a1309..fe945436 100644
--- a/tests/v4/test_particleset_execute.py
+++ b/tests/v4/test_particleset_execute.py
@@ -407,6 +407,18 @@ def test_execution_update_particle_in_kernel_function(fieldset, kernel_names, ex
     np.testing.assert_allclose(pset.lat, expected, rtol=1e-5)


+def test_execution_changing_particle_mask_bug(fieldset):
+    npart = 2
+
+    def kernel(particles, fieldset):
+        for i in range(1, 4):
+            print(f"before iteration {i}: {particles.lon=}, {particles._index=}")
+            particles[particles.lon < 0.5]
+
+    pset = ParticleSet(fieldset, lon=np.linspace(0, 1, npart), lat=np.zeros(npart))
+    pset.execute([kernel], runtime=np.timedelta64(2, "s"), dt=np.timedelta64(1, "s"))
+
+
 def test_uxstommelgyre_pset_execute():
     ds = datasets_unstructured["stommel_gyre_delaunay"]
     grid = UxGrid(grid=ds.uxgrid, z=ds.coords["nz"], mesh="spherical")
```
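The test above hinges on selecting a subset of particles with a boolean mask while an index array tracks the survivors. A minimal numpy-only sketch of that bookkeeping (no Parcels involved; the variable names mirror the test but are illustrative):

```python
import numpy as np

# numpy-only illustration of the bookkeeping the test exercises:
# a boolean mask selects a subset of particles, and the per-particle
# index array should stay consistent with the surviving particles.
lon = np.linspace(0, 1, 2)     # two particles: lon = [0.0, 1.0]
index = np.arange(lon.size)    # per-particle index, like particles._index
mask = lon < 0.5               # mask selecting particles west of 0.5
print(lon[mask], index[mask])  # only particle 0 survives the mask
```

In plain numpy each masked selection produces a fresh, consistent index set; the bug in the test is that the kernel-side view of the particles does not stay in sync with such a selection.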
It's to do with […]
The CI for this PR fails because of the unit test introduced in dcdd2d1. This is intentional though (until we fix #2143), so perhaps we should move that dcdd2d1 commit to another branch and merge this one? @VeckoTheGecko, can you help out if you agree?
Yes, agreed that we should just deal with this in another PR. Can we just do the following rather than moving the test?

```python
@pytest.mark.xfail(reason="Will be fixed alongside https://github.com/OceanParcels/Parcels/issues/2143 . Failing due to https://github.com/OceanParcels/Parcels/pull/2199#issuecomment-3285278876 .")
```
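For illustration, here is a minimal, self-contained sketch of how the suggested marker would sit on the failing test; the test body below is a placeholder that always fails, not the real Parcels test.

```python
import pytest

# The reason string is taken from the suggestion above; the body is a
# stand-in that mimics the known failure until #2143 is fixed.
@pytest.mark.xfail(
    reason="Will be fixed alongside https://github.com/OceanParcels/Parcels/issues/2143 . "
    "Failing due to https://github.com/OceanParcels/Parcels/pull/2199#issuecomment-3285278876 ."
)
def test_execution_changing_particle_mask_bug():
    raise AssertionError("known failure until #2143 is fixed")
```

With the marker in place, pytest reports the test as `xfail` instead of failing the CI run, and it will surface as `XPASS` once the underlying bug is fixed.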
Since we now have `fieldset.from_copernicusmarine()`, the loading is much easier. An advantage is that we can then also show temperature sampling!
VeckoTheGecko left a comment:
Looks good. Note that we have yet to re-enable notebooks in CI; let's do that in a future PR (it becomes much easier to do in CI and locally once #2205 is fixed).
This PR converts the Argo tutorial from v3 to v4.
It still has a few (breaking) issues that will need to be resolved:
- `ValueError: Failed to decode variable 'time': unable to decode time units 'seconds since 2002-01-01T00:00:00' with "calendar 'standard'". Try opening your dataset with decode_times=False or installing cftime if it is not installed`, which is strange because `seconds since 2002-01-01T00:00:00` seems a perfectly fine time unit?
- The `ArgoVerticalMovement()` kernel doesn't work. Particle properties, including `cycle_phase`, are not updated as expected. This likely has to do with the current implementation of `KernelParticle`, which is rather ad-hoc.
- When loading `GlobCurrent_example_data` as a DataSet, we have to recalculate the `ds["time"]`, otherwise we get a `ValueError: Expected right to be a np.timedelta64, datetime, cftime.datetime, or np.datetime64. Got <class 'numpy.float64'>. Error getting time interval for field 'U'. Are you sure that the time dimension on the xarray dataset is stored as timedelta, datetime or cftime datetime objects?`
- `ds.load()` is extremely slow (> 15 minutes) compared to the ~70 seconds when the full DataSet is loaded in memory.
- […] `np.timedelta`, because that gives an error in output_file writing (`KeyError: dtype('<m8[s]')` in `attrs["time"].update(_get_calendar_and_units(time_interval))`).
- (`FieldSet.from_GlobCurrent` or something?)

UPDATE: I now swapped to Copernicus Marine data, which solved a few of the major issues above.
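The "recalculate `ds["time"]`" workaround mentioned above can be sketched with plain numpy: converting a float64 "seconds since epoch" coordinate into proper `np.datetime64` values, which is what the time-interval check expects. The variable names here are illustrative, not the actual tutorial code.

```python
import numpy as np

# Hypothetical sketch: ds["time"] was decoded as float64 seconds since
# 2002-01-01 (the units from the ValueError above); rebuild it as
# np.datetime64 so the time-interval check for field 'U' can succeed.
epoch = np.datetime64("2002-01-01T00:00:00", "s")
raw_seconds = np.array([0.0, 3600.0, 7200.0])  # float64, as decoded without cftime
time = epoch + raw_seconds.astype("timedelta64[s]")
print(time)  # proper datetime64 values instead of raw floats
```

The same idea applies whether the coordinate is reassigned on the xarray dataset or computed up front before constructing the FieldSet.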
[…] (`main` for v3 changes, `v4-dev` for v4 changes)