diff --git a/docs/user_guide/examples_v3/tutorial_diffusion.ipynb b/docs/user_guide/examples/tutorial_diffusion.ipynb similarity index 75% rename from docs/user_guide/examples_v3/tutorial_diffusion.ipynb rename to docs/user_guide/examples/tutorial_diffusion.ipynb index aef9bb06f..0c0013765 100644 --- a/docs/user_guide/examples_v3/tutorial_diffusion.ipynb +++ b/docs/user_guide/examples/tutorial_diffusion.ipynb @@ -74,7 +74,11 @@ "\n", "The extra term in the M1 scheme provides extra accuracy at negligible computational cost.\n", "\n", - "The spatial derivatives in the EM and M1 schemes can be approximated by a central difference. Higher order numerical schemes (see [Gräwe et al., 2012](https://doi.org/10.1007/s10236-012-0523-y)) include higher order derivatives. Since Parcels uses bilinear interpolation, these higher order derivatives cannot be computed, meaning that higher order numerical schemes cannot be used.\n", + "The spatial derivatives in the EM and M1 schemes can be approximated by a central difference. Higher order numerical schemes (see [Gräwe et al., 2012](https://doi.org/10.1007/s10236-012-0523-y)) include higher order derivatives.\n", + "\n", + "```{note}\n", + "TODO: explore higher order derivatives and numerical schemes (using cubic spline interpolation) in v4 \n", + "```\n", "\n", "An overview of numerical approximations for SDEs in a particle tracking setting can be found in [Gräwe (2011)](https://doi.org/10.1016/j.ocemod.2010.10.002).\n", "\n", @@ -82,11 +86,11 @@ "\n", "The EM and M1 advection-diffusion approximations are available as `AdvectionDiffusionEM` and `AdvectionDiffusionM1`, respectively. The `AdvectionDiffusionM1` kernel should be the default choice, as the increased accuracy comes at negligible computational cost.\n", "\n", - "The advection component of these kernels is similar to that of the Explicit Euler advection kernel (`AdvectionEE`). In the special case where diffusivity is constant over the entire domain, the diffusion-only kernel `DiffusionUniformKh` can be used in combination with an advection kernel of choice. Since the diffusivity here is space-independent, gradients are not calculated, increasing efficiency. The diffusion-step can in this case be computed after or before advection, thus allowing you to chain kernels using the `+` operator.\n", + "The advection component of these kernels is similar to that of the Explicit Euler advection kernel (`AdvectionEE`). In the special case where diffusivity is constant over the entire domain, the diffusion-only kernel `DiffusionUniformKh` can be used in combination with an advection kernel of choice. Since the diffusivity here is space-independent, gradients are not calculated, increasing efficiency. The diffusion-step can in this case be computed after or before advection, thus allowing you to chain kernels in a list.\n", "\n", "Just like velocities, diffusivities are passed to Parcels in the form of `Field` objects. When using `DiffusionUniformKh`, they should be added to the `FieldSet` object as constant fields, e.g. `fieldset.add_constant_field(\"Kh_zonal\", 1, mesh=\"flat\")`.\n", "\n", - "To make a central difference approximation for computing the gradient in diffusivity, a resolution for this approximation `dres` is needed: _Parcels_ approximates the gradients in diffusivities by using their values at the particle's location ± `dres` (in both $x$ and $y$). A value of `dres` must be specified and added to the FieldSet by the user (e.g. `fieldset.add_constant(\"dres\", 0.01)`). 
Currently, it is unclear what the best value of `dres` is. From experience, its size of `dres` should be smaller than the spatial resolution of the data, but within reasonable limits of machine precision to avoid numerical errors. We are working on a method to compute gradients differently so that specifying `dres` is not necessary anymore.\n", + "To make a central difference approximation for computing the gradient in diffusivity, a resolution for this approximation `dres` is needed: _Parcels_ approximates the gradients in diffusivities by using their values at the particle's location ± `dres` (in both $x$ and $y$). A value of `dres` must be specified and added to the FieldSet by the user (e.g. `fieldset.add_constant(\"dres\", 0.01)`). Currently, it is unclear what the best value of `dres` is. From experience, the size of `dres` should be smaller than the spatial resolution of the data, but within reasonable limits of machine precision to avoid numerical errors. We are working on a method to compute gradients differently so that specifying `dres` is not necessary anymore.\n", "\n", "## Example: Impermeable Diffusivity Profile\n", "\n", @@ -110,17 +114,12 @@ "metadata": {}, "outputs": [], "source": [ - "import random\n", - "from datetime import timedelta\n", - "\n", "import matplotlib.pyplot as plt\n", "import numpy as np\n", "import trajan as ta\n", "import xarray as xr\n", "\n", - "import parcels\n", - "from parcels.grid import GridType\n", - "from parcels.tools.converters import Geographic, GeographicPolar, UnitConverter" + "import parcels" ] }, { @@ -191,17 +190,35 @@ "metadata": {}, "outputs": [], "source": [ - "xdim, ydim = (1, Ny)\n", - "data = {\n", - " \"U\": np.zeros(ydim),\n", - " \"V\": np.zeros(ydim),\n", - " \"Kh_zonal\": K_bar * np.ones(ydim),\n", - " \"Kh_meridional\": Kh_meridional,\n", - "}\n", - "dims = {\"lon\": 1, \"lat\": np.linspace(-0.01, 1.01, ydim, dtype=np.float32)}\n", - "fieldset = parcels.FieldSet.from_data(\n", - " data, dims, mesh=\"flat\", allow_time_extrapolation=True\n", + "from parcels._datasets.structured.generated import simple_UV_dataset\n", + "\n", + "ds = simple_UV_dataset(dims=(1, 1, Ny, 1), mesh=\"flat\").isel(time=0, depth=0)\n", + "ds[\"lat\"][:] = np.linspace(-0.01, 1.01, Ny)\n", + "ds[\"lon\"][:] = np.ones(len(ds.XG))\n", + "ds[\"Kh_meridional\"] = ([\"YG\", \"XG\"], Kh_meridional[:, None])\n", + "ds" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "grid = parcels.XGrid.from_dataset(ds, mesh=\"flat\")\n", + "U = parcels.Field(\"U\", ds[\"U\"], grid, interp_method=parcels.interpolators.XLinear)\n", + "V = parcels.Field(\"V\", ds[\"V\"], grid, interp_method=parcels.interpolators.XLinear)\n", + "UV = parcels.VectorField(\"UV\", U, V)\n", + "\n", + "Kh_meridional_field = parcels.Field(\n", + " \"Kh_meridional\",\n", + " ds[\"Kh_meridional\"],\n", + " grid,\n", + " interp_method=parcels.interpolators.XLinear,\n", ")\n", + "fieldset = parcels.FieldSet([U, V, UV, Kh_meridional_field])\n", + "fieldset.add_constant_field(\"Kh_zonal\", 1, mesh=\"flat\")\n", + "\n", "fieldset.add_constant(\"dres\", 0.00005)" ] }, @@ -220,13 +237,12 @@ "outputs": [], "source": [ "def get_test_particles():\n", - " return parcels.ParticleSet.from_list(\n", + " return parcels.ParticleSet(\n", " fieldset,\n", " pclass=parcels.Particle,\n", " lon=np.zeros(100),\n", " lat=np.ones(100) * 0.75,\n", - " time=np.zeros(100),\n", - " lonlatdepth_dtype=np.float64,\n", + " time=np.repeat(np.timedelta64(0, \"s\"), 
100),\n", " )" ] }, @@ -235,25 +251,31 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Now we will simulate the advection and diffusion of the particles using the `AdvectionDiffusionM1` kernel. We run the simulation for 0.3 seconds, with a numerical timestep $\\Delta t = 0.001$s. We also write away particle locations at each timestep for plotting. Note that this will hinder a runtime comparison between kernels, since it will cause most time to be spent on I/O.\n" + "Now we will simulate the advection and diffusion of the particles using the `AdvectionDiffusionM1` kernel. We run the simulation for 0.3 seconds, with a numerical timestep $\\Delta t = 0.001$ s. We also write away particle locations at each timestep for plotting. Note that this will hinder a runtime comparison between kernels, since it will cause most time to be spent on I/O.\n" ] }, { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "tags": [ + "hide-output" + ] + }, "outputs": [], "source": [ - "dt = 0.001\n", "testParticles = get_test_particles()\n", - "output_file = testParticles.ParticleFile(\n", - " name=\"M1_out.zarr\", chunks=(len(testParticles), 50), outputdt=timedelta(seconds=dt)\n", + "testParticles.update_dt_dtype(\"timedelta64[ms]\")\n", + "output_file = parcels.ParticleFile(\n", + " store=\"M1_out.zarr\",\n", + " chunks=(len(testParticles), 50),\n", + " outputdt=np.timedelta64(1, \"ms\"),\n", ")\n", "\n", "testParticles.execute(\n", " parcels.kernels.AdvectionDiffusionM1,\n", - " runtime=timedelta(seconds=0.3),\n", - " dt=timedelta(seconds=dt),\n", + " runtime=np.timedelta64(300, \"ms\"),\n", + " dt=np.timedelta64(1, \"ms\"),\n", " output_file=output_file,\n", ")" ] @@ -315,19 +337,25 @@ { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "tags": [ + "hide-output" + ] + }, "outputs": [], "source": [ - "dt = 0.001\n", "testParticles = get_test_particles()\n", - "output_file = testParticles.ParticleFile(\n", - " name=\"EM_out.zarr\", chunks=(len(testParticles), 50), outputdt=timedelta(seconds=dt)\n", + "testParticles.update_dt_dtype(\"timedelta64[ms]\")\n", + "output_file = parcels.ParticleFile(\n", + " store=\"EM_out.zarr\",\n", + " chunks=(len(testParticles), 50),\n", + " outputdt=np.timedelta64(1, \"ms\"),\n", ")\n", - "random.seed(1636) # Random seed for reproducibility\n", + "np.random.seed(1636) # Random seed for reproducibility\n", "testParticles.execute(\n", " parcels.kernels.AdvectionDiffusionEM,\n", - " runtime=timedelta(seconds=0.3),\n", - " dt=timedelta(seconds=dt),\n", + " runtime=np.timedelta64(300, \"ms\"),\n", + " dt=np.timedelta64(1, \"ms\"),\n", " output_file=output_file,\n", ")" ] @@ -402,13 +430,22 @@ "metadata": {}, "outputs": [], "source": [ - "def smagdiff(particle, fieldset, time):\n", + "def smagdiff(particles, fieldset):\n", + " dt = particles.dt / np.timedelta64(1, \"s\")\n", " dx = 0.01\n", " # gradients are computed by using a local central difference.\n", - " updx, vpdx = fieldset.UV[time, particle.depth, particle.lat, particle.lon + dx]\n", - " umdx, vmdx = fieldset.UV[time, particle.depth, particle.lat, particle.lon - dx]\n", - " updy, vpdy = fieldset.UV[time, particle.depth, particle.lat + dx, particle.lon]\n", - " umdy, vmdy = fieldset.UV[time, particle.depth, particle.lat - dx, particle.lon]\n", + " updx, vpdx = fieldset.UV[\n", + " particles.time, particles.z, particles.lat, particles.lon + dx, particles\n", + " ]\n", + " umdx, vmdx = fieldset.UV[\n", + " particles.time, particles.z, particles.lat, 
particles.lon - dx, particles\n", + " ]\n", + " updy, vpdy = fieldset.UV[\n", + " particles.time, particles.z, particles.lat + dx, particles.lon, particles\n", + " ]\n", + " umdy, vmdy = fieldset.UV[\n", + " particles.time, particles.z, particles.lat - dx, particles.lon, particles\n", + " ]\n", "\n", " dudx = (updx - umdx) / (2 * dx)\n", " dudy = (updy - umdy) / (2 * dx)\n", @@ -416,16 +453,15 @@ " dvdx = (vpdx - vmdx) / (2 * dx)\n", " dvdy = (vpdy - vmdy) / (2 * dx)\n", "\n", - " A = fieldset.cell_areas[time, 0, particle.lat, particle.lon]\n", - " sq_deg_to_sq_m = (1852 * 60) ** 2 * math.cos(particle.lat * math.pi / 180)\n", + " A = fieldset.cell_areas[particles.time, particles.z, particles.lat, particles.lon]\n", + " sq_deg_to_sq_m = (1852 * 60) ** 2 * np.cos(particles.lat * np.pi / 180)\n", " A = A / sq_deg_to_sq_m\n", - " Kh = fieldset.Cs * A * math.sqrt(dudx**2 + 0.5 * (dudy + dvdx) ** 2 + dvdy**2)\n", - "\n", - " dlat = random.normalvariate(0.0, 1.0) * math.sqrt(2 * math.fabs(particle.dt) * Kh)\n", - " dlon = random.normalvariate(0.0, 1.0) * math.sqrt(2 * math.fabs(particle.dt) * Kh)\n", + " Kh = fieldset.Cs * A * np.sqrt(dudx**2 + 0.5 * (dudy + dvdx) ** 2 + dvdy**2)\n", "\n", - " particle_dlat += dlat\n", - " particle_dlon += dlon" + " dlat = np.random.normal(0.0, 1.0, dt.shape) * np.sqrt(2 * np.fabs(dt) * Kh)\n", + " dlon = np.random.normal(0.0, 1.0, dt.shape) * np.sqrt(2 * np.fabs(dt) * Kh)\n", + " particles.dlat += dlat\n", + " particles.dlon += dlon" ] }, { @@ -433,7 +469,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Reading velocity fields from netcdf files\n" + "Reading velocity fields from a Copernicus Marine Service dataset:\n" ] }, { @@ -442,18 +478,15 @@ "metadata": {}, "outputs": [], "source": [ - "example_dataset_folder = parcels.download_example_dataset(\"GlobCurrent_example_data\")\n", - "filenames = {\n", - " \"U\": f\"{example_dataset_folder}/20*.nc\",\n", - " \"V\": f\"{example_dataset_folder}/20*.nc\",\n", - "}\n", - "variables = {\n", - " \"U\": \"eastward_eulerian_current_velocity\",\n", - " \"V\": \"northward_eulerian_current_velocity\",\n", - "}\n", - "dimensions = {\"lat\": \"lat\", \"lon\": \"lon\", \"time\": \"time\"}\n", - "\n", - "fieldset = parcels.FieldSet.from_netcdf(filenames, variables, dimensions)" + "# Load the CopernicusMarine data in the Agulhas region from the example_datasets\n", + "example_dataset_folder = parcels.download_example_dataset(\n", + " \"CopernicusMarine_data_for_Argo_tutorial\"\n", + ")\n", + "\n", + "ds_fields = xr.open_mfdataset(\n", + " f\"{example_dataset_folder}/*.nc\", combine=\"by_coords\"\n", + ").isel(depth=slice(0, 1))\n", + "ds_fields.load() # load the dataset into memory" ] }, { @@ -470,45 +503,49 @@ "metadata": {}, "outputs": [], "source": [ - "x = fieldset.U.grid.lon\n", - "y = fieldset.U.grid.lat\n", + "fieldset = parcels.FieldSet.from_copernicusmarine(ds_fields)\n", + "\n", + "\n", + "def degree_lat_to_meter(d):\n", + " return d * 1000.0 * 1.852 * 60.0\n", + "\n", "\n", + "def degree_lon_to_meter(d, lat):\n", + " return d * 1000.0 * 1.852 * 60.0 * np.cos(lat * np.pi / 180)\n", "\n", - "def calc_cell_edge_sizes(grid):\n", - " \"\"\"Calculate cell sizes based on numpy.gradient method.\n", "\n", - " Currently only works for Rectilinear Grids. Operates in place adding a `cell_edge_sizes`\n", - " attribute to the grid.\n", - " \"\"\"\n", - " assert grid._gtype in (GridType.RectilinearZGrid, GridType.RectilinearSGrid), (\n", - " f\"_cell_edge_sizes() not implemented for {grid._gtype} grids. 
\"\n", - " \"You can provide cell_edge_sizes yourself by in, e.g., \"\n", - " \"NEMO using the e1u fields etc from the mesh_mask.nc file.\"\n", - " )\n", + "def calc_cell_areas(ds):\n", + " \"\"\"calculate cell areas for rectilinear grids\"\"\"\n", + " lon, lat = ds[\"longitude\"], ds[\"latitude\"]\n", + " assert \"degrees\" in lon.attrs[\"units\"]\n", + " assert \"degrees\" in lat.attrs[\"units\"]\n", "\n", - " cell_edge_sizes_x = np.zeros((grid.ydim, grid.xdim), dtype=np.float32)\n", - " cell_edge_sizes_y = np.zeros((grid.ydim, grid.xdim), dtype=np.float32)\n", + " LON, LAT = np.meshgrid(lon, lat)\n", + " X, Y = degree_lon_to_meter(LON, LAT), degree_lat_to_meter(LAT)\n", "\n", - " x_conv = GeographicPolar() if grid.mesh == \"spherical\" else UnitConverter()\n", - " y_conv = Geographic() if grid.mesh == \"spherical\" else UnitConverter()\n", - " for y, (lat, dy) in enumerate(zip(grid.lat, np.gradient(grid.lat), strict=False)):\n", - " for x, (lon, dx) in enumerate(\n", - " zip(grid.lon, np.gradient(grid.lon), strict=False)\n", - " ):\n", - " cell_edge_sizes_x[y, x] = x_conv.to_source(dx, grid.depth[0], lat, lon)\n", - " cell_edge_sizes_y[y, x] = y_conv.to_source(dy, grid.depth[0], lat, lon)\n", - " return cell_edge_sizes_x, cell_edge_sizes_y\n", + " dX = np.gradient(X, axis=1)\n", + " dY = np.gradient(Y, axis=0)\n", + " cell_areas = dX * dY\n", "\n", + " return cell_areas\n", "\n", - "def calc_cell_areas(grid):\n", - " cell_edge_sizes_x, cell_edge_sizes_y = calc_cell_edge_sizes(grid)\n", - " return cell_edge_sizes_x * cell_edge_sizes_y\n", "\n", + "da_cell_areas = xr.DataArray(\n", + " data=calc_cell_areas(ds_fields),\n", + " coords=dict(\n", + " latitude=([\"lat\"], ds_fields.latitude.values),\n", + " longitude=([\"lon\"], ds_fields.longitude.values),\n", + " ),\n", + " dims=[\"lat\", \"lon\"],\n", + ")\n", "\n", - "cell_areas = parcels.Field(\n", - " name=\"cell_areas\", data=calc_cell_areas(fieldset.U.grid), lon=x, lat=y\n", + "cell_areas_field = parcels.Field(\n", + " \"cell_areas\",\n", + " da_cell_areas,\n", + " fieldset.U.grid,\n", + " interp_method=parcels.interpolators.XNearest,\n", ")\n", - "fieldset.add_field(cell_areas)\n", + "fieldset.add_field(cell_areas_field)\n", "\n", "fieldset.add_constant(\"Cs\", 0.1)" ] @@ -518,52 +555,42 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "In the example, particles are released at one location periodically (every hour) for 12 hours\n" + "In the example, particles are released at one location and simulated for 2 days using advection (`AdvectionRK4`) and diffusion (`smagdiff`) kernels." 
] }, { "cell_type": "code", "execution_count": null, - "metadata": {}, + "metadata": { + "tags": [ + "hide-output" + ] + }, "outputs": [], "source": [ - "time = np.arange(0, 43200, 3600)\n", - "lon = [29] * len(time)\n", - "lat = [-33] * len(time)\n", + "time = np.repeat(fieldset.time_interval.left, 13)\n", + "lon = [32] * len(time)\n", + "lat = [-30.1] * len(time)\n", + "z = [ds_fields.depth[0]] * len(time)\n", "pset = parcels.ParticleSet(\n", - " fieldset=fieldset, pclass=parcels.Particle, lon=lon, lat=lat, time=time\n", - ")" - ] - }, - { - "attachments": {}, - "cell_type": "markdown", - "metadata": {}, - "source": [ - "Simulating the particles moving during 7 days using advection (`AdvectionRK4`) and diffusion (`smagdiff`) kernels.\n" - ] - }, - { - "cell_type": "code", - "execution_count": null, - "metadata": {}, - "outputs": [], - "source": [ - "def DeleteParticle(particle, fieldset, time):\n", - " if particle.state == parcels.StatusCode.ErrorOutOfBounds:\n", - " particle.delete()\n", - "\n", + " fieldset=fieldset,\n", + " pclass=parcels.Particle,\n", + " time=time,\n", + " z=z,\n", + " lat=lat,\n", + " lon=lon,\n", + ")\n", "\n", - "output_file = pset.ParticleFile(\n", - " name=\"Global_smagdiff.zarr\", outputdt=timedelta(hours=3), chunks=(1, 57)\n", + "output_file = parcels.ParticleFile(\n", + " store=\"smagdiff.zarr\", outputdt=np.timedelta64(1, \"h\"), chunks=(1, 57)\n", ")\n", "\n", - "random.seed(1636) # Random seed for reproducibility\n", + "np.random.seed(1636) # Random seed for reproducibility\n", "\n", "pset.execute(\n", - " [parcels.kernels.AdvectionRK4, smagdiff, DeleteParticle],\n", - " runtime=timedelta(days=7),\n", - " dt=timedelta(minutes=5),\n", + " [parcels.kernels.AdvectionRK4, smagdiff],\n", + " runtime=np.timedelta64(2, \"D\"),\n", + " dt=np.timedelta64(5, \"m\"),\n", " output_file=output_file,\n", ")" ] @@ -573,7 +600,7 @@ "cell_type": "markdown", "metadata": {}, "source": [ - "Visualize the trajectories using the trajan package\n" + "When we visualize the trajectories using the trajan package, we can see that the particles released at the same time and location spread out over time:\n" ] }, { @@ -582,9 +609,15 @@ "metadata": {}, "outputs": [], "source": [ - "ds = xr.open_zarr(\"Global_smagdiff.zarr\")\n", + "ds_particles = xr.open_zarr(\"smagdiff.zarr\")\n", "\n", - "ds.traj.plot()\n", + "temperature = ds_fields.isel(time=0, depth=0).thetao.plot(cmap=\"magma\")\n", + "velocity = ds_fields.isel(time=0, depth=0).plot.quiver(\n", + " x=\"longitude\", y=\"latitude\", u=\"uo\", v=\"vo\"\n", + ")\n", + "particles = ds_particles.traj.plot(color=\"blue\")\n", + "plt.ylim(-31, -30)\n", + "plt.xlim(31, 32.1)\n", "plt.show()" ] }, @@ -614,7 +647,7 @@ ], "metadata": { "kernelspec": { - "display_name": "parcels-dev", + "display_name": "test-notebooks", "language": "python", "name": "python3" }, @@ -628,7 +661,7 @@ "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", - "version": "3.10.15" + "version": "3.11.0" } }, "nbformat": 4, diff --git a/docs/user_guide/index.md b/docs/user_guide/index.md index 1c456403c..fc6878acd 100644 --- a/docs/user_guide/index.md +++ b/docs/user_guide/index.md @@ -49,6 +49,7 @@ examples/tutorial_sampling.ipynb examples/tutorial_statuscodes.ipynb examples/tutorial_gsw_density.ipynb examples/tutorial_Argofloats.ipynb +examples/tutorial_diffusion.ipynb ``` ```{toctree} @@ -59,7 +60,6 @@ examples/explanation_interpolation.md examples/tutorial_interpolation.ipynb ``` -
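
As a companion to the tutorial changes above, here is a minimal sketch of the constant-diffusivity pattern the notebook text describes: chaining an advection kernel with `DiffusionUniformKh` in a kernel list, so that no diffusivity gradients (and hence no `dres` constant) are needed. It reuses only calls that appear in the diff itself (`simple_UV_dataset`, `XGrid.from_dataset`, `Field`/`VectorField`, `add_constant_field`, `update_dt_dtype`, `ParticleSet`, `execute`); the kernel paths `parcels.kernels.AdvectionEE` and `parcels.kernels.DiffusionUniformKh` are assumed from the tutorial prose rather than shown in the diff, and output writing is omitted for brevity.

```python
# Sketch only (not part of the diff): constant-Kh diffusion chained with advection.
import numpy as np

import parcels
from parcels._datasets.structured.generated import simple_UV_dataset

# Same generated zero-velocity dataset as in the notebook's first example.
Ny = 100
ds = simple_UV_dataset(dims=(1, 1, Ny, 1), mesh="flat").isel(time=0, depth=0)
ds["lat"][:] = np.linspace(-0.01, 1.01, Ny)
ds["lon"][:] = np.ones(len(ds.XG))

grid = parcels.XGrid.from_dataset(ds, mesh="flat")
U = parcels.Field("U", ds["U"], grid, interp_method=parcels.interpolators.XLinear)
V = parcels.Field("V", ds["V"], grid, interp_method=parcels.interpolators.XLinear)
UV = parcels.VectorField("UV", U, V)
fieldset = parcels.FieldSet([U, V, UV])

# Space-independent diffusivities are added as constant fields, so no gradients
# are computed and no `dres` constant is required.
fieldset.add_constant_field("Kh_zonal", 1, mesh="flat")
fieldset.add_constant_field("Kh_meridional", 1, mesh="flat")

pset = parcels.ParticleSet(
    fieldset,
    pclass=parcels.Particle,
    lon=np.zeros(100),
    lat=np.ones(100) * 0.75,
    time=np.repeat(np.timedelta64(0, "s"), 100),
)
pset.update_dt_dtype("timedelta64[ms]")

# Because the diffusion step is independent of the advection step here, the two
# kernels can simply be chained in a list (kernel names assumed, see note above).
pset.execute(
    [parcels.kernels.AdvectionEE, parcels.kernels.DiffusionUniformKh],
    runtime=np.timedelta64(300, "ms"),
    dt=np.timedelta64(1, "ms"),
)
```

The kernel-list form is the point of the sketch: with spatially varying diffusivity the combined `AdvectionDiffusionM1`/`AdvectionDiffusionEM` kernels (plus `dres`) are needed, whereas with uniform diffusivity the diffusion step can be placed before or after any advection kernel of choice.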