This repository contains the code used to make the figures and graphs in the Sacramento-San Joaquin Delta Consumptive Use reports, starting with the interim report (preliminary results for the 2015-2016 water year). Figures for the 2014-2015 water year report were initially produced in Google Sheets, but for the final report they were generated using the R code base included in this repository. Code to make maps is in the ssj-data-viz repository.
As a quick overview of the process, generating graphs and figures involves the following:
- Retrieving data from the various modeling groups
- Processing those data into monthly rasters, as appropriate, and uploading them to Earth Engine
- Running Earth Engine-based scripts that output necessary data tables for graphs
- Downloading those data tables to directories within this repository
- Running an R Markdown file that generates all of the graphs
- Manually inserting generated graphs into the report
For a detailed overview of the process for creating the graphs as of mid-2017, see this full video training.
- R, with the following packages:
  - ggplot2
  - dplyr
  - plyr
  - rjson
  - tidyr
  - tidyverse
  - sf
  - jsonlite
  - geosphere
  - lazyeval
- Access to Earth Engine
Again, for a detailed overview of the process for creating the graphs as of mid-2017, see this full video training.
The first step is to obtain data from each modeling group. This process varies by group and by their data format: some groups upload their data directly to GitHub, some prefer delivering via FTP, and others prefer sending some other form of link, such as a Google Drive folder. See each individual group's repository for more information about this process (many notes about the process are documented as GitHub issues, so make sure to check closed issues as well). This portion of the process isn't usually too complicated, though for groups that deliver via GitHub, be careful that they don't commit daily rasters, especially not in multiple versions, as that can use up our GitHub LFS allocation very quickly.
Once the data have been retrieved, they need to be processed into monthly rasters. Some groups deliver monthly data, so this step can be skipped for them, but others deliver daily data (DisALEXI) or Landsat-date data (UCD METRIC). Each of these models has a different procedure for generating monthly rasters.
DisALEXI provided daily ET data, which is downloaded via FTP (note: copies of all the raw DisALEXI data are on the CWS servers at `X:\ssj-delta-cu\Code\ssj-disalexi`). Please see the notes in the "Upload DisALEXI data into Earth Engine / Compute Monthly summaries" issue for more information about converting the `.hdr` rasters and uploading them to an Earth Engine ImageCollection. Once all the images are in an Earth Engine ImageCollection, use the Earth Engine script `ssj-delta-et/ssj-disalexi/daily_to_monthly.js` to create the 12-band monthly ET raster for each water year.
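For orientation, here is a minimal Earth Engine sketch of the kind of daily-to-monthly aggregation `daily_to_monthly.js` performs; the asset ID is a placeholder, and the real script may differ in details (for example, it may average rather than sum):

```javascript
// Sketch only: aggregate one water year of daily ET images into a single
// 12-band monthly raster. Assumes `daily` holds exactly one water year.
var daily = ee.ImageCollection('users/example/ssj-disalexi/daily_et');  // placeholder ID

var months = ee.List.sequence(1, 12);
var monthly = ee.ImageCollection.fromImages(
  months.map(function(m) {
    m = ee.Number(m);
    return daily
      .filter(ee.Filter.calendarRange(m, m, 'month'))
      .sum()                       // assumed: monthly ET as the sum of daily ET
      .set('month', m);
  })
);

// toBands() flattens the collection into a single 12-band image.
var waterYear = monthly.toBands();
print(waterYear);
```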
UCD METRIC provided images by Landsat date. The Earth Engine script to interpolate the monthly values from the overpass images can be found at `ssj-delta-cu/ssj-ucd-metric/landsat_to_monthly.js`.
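As a rough illustration of the interpolation idea (not the actual script), ET on a date between two overpasses can be estimated by linearly weighting the bracketing overpass images; the asset IDs and dates below are hypothetical:

```javascript
// Sketch only: linear interpolation of ET between two Landsat overpass images.
// The asset IDs and dates are placeholders.
var before = ee.Image('users/example/ssj-ucd-metric/ET_20160602');
var after = ee.Image('users/example/ssj-ucd-metric/ET_20160618');
var t0 = ee.Date('2016-06-02').millis();
var t1 = ee.Date('2016-06-18').millis();
var t = ee.Date('2016-06-10').millis();

// Fractional distance of the target date between the two overpasses.
var w = t.subtract(t0).divide(t1.subtract(t0));
var interpolated = before.multiply(ee.Number(1).subtract(w))
                         .add(after.multiply(w));
```

`landsat_to_monthly.js` presumably applies this day by day and accumulates the results into monthly values; verify the details in the script itself.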
TODO: `landsat_to_monthly.js` should be moved over to the `ssj-delta-et` repository on Earth Engine that is owned by the CWS shared account; `ssj-delta-cu` is deprecated.
Regardless, for all models, data should be uploaded to Earth Engine. We keep Earth Engine data in a shared account ([email protected]) so that assets can be jointly managed. We do this, rather than uploading to individual accounts and sharing, because when a staff changeover occurred, it was a nightmare to get the assets needed for model runs reshared. Keeping the assets in a shared account, with the username and password stored in a database managed by CWS, minimizes these problems, since we always have someone on staff with direct access to the assets.
All data for the charts and figures are exported from Google Earth Engine as `.json` files (see `et_comparisons.js` for more info). These files are then processed in R to standardize the data and add additional attributes (i.e., crop names and the number of days in the month). The tidy results are then saved to disk and used to build the plots and figures.
`et_comparisons.js` is the main script for analyzing the monthly ET data. The script runs through all of the landuse types and calculates various statistics for a given area of interest (Delta Service Area or Legal Delta). The `.json` export contains the landuse type, the count of cells in the region of interest, the mean daily ET value for the month, the median ET value, and the values of the 9th, 25th, 75th, and 91st percentiles. The script creates an export task for each model, so make sure to start each task in the Tasks tab.
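A hedged sketch of the grouped-statistics-and-export pattern described above (all asset IDs and band names are placeholders; see `et_comparisons.js` for the real logic):

```javascript
// Sketch only: per-landuse ET statistics over an area of interest,
// exported as a table. Asset IDs and band names are placeholders.
var et = ee.Image('users/example/ssj-delta-et/monthly_et').select('ET_06');
var landuse = ee.Image('users/example/ssj-delta-cu/landuse');
var aoi = ee.FeatureCollection('users/example/ssj-delta-cu/delta_service_area');

// Mean, median, 9/25/75/91st percentiles, and cell count, grouped by landuse.
var reducer = ee.Reducer.mean()
  .combine(ee.Reducer.median(), '', true)
  .combine(ee.Reducer.percentile([9, 25, 75, 91]), '', true)
  .combine(ee.Reducer.count(), '', true)
  .group({groupField: 1, groupName: 'landuse'});

var stats = et.addBands(landuse).reduceRegion({
  reducer: reducer,
  geometry: aoi.geometry(),
  scale: 30,
  maxPixels: 1e10
});

// Exporting creates a task that must be started manually in the Tasks tab.
Export.table.toDrive({
  collection: ee.FeatureCollection([ee.Feature(null, {stats: stats})]),
  description: 'et_comparisons_example',
  fileFormat: 'GeoJSON'
});
```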
Similar to `et_comparisons.js`, but includes several additional subregions for analysis (South, North, Central, Yolo, West).
Extracts the monthly ET value at each of the field stations. This is deprecated, since the final report used the daily values for the overpass days (see `daily_charts`).
Summarizes the agricultural area only (all non-ag landuses are masked) for all 168 DETAW subregions.
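The subregion summary boils down to masking followed by a region reduction; a minimal sketch (the asset IDs and the ag class code are invented for illustration):

```javascript
// Sketch only: mask non-ag landuse, then summarize ET per DETAW subregion.
// Asset IDs and the ag class code are placeholders.
var et = ee.Image('users/example/ssj-delta-et/monthly_et');
var landuse = ee.Image('users/example/ssj-delta-cu/landuse');
var subregions = ee.FeatureCollection('users/example/ssj-delta-cu/detaw_subregions');

var agEt = et.updateMask(landuse.eq(1));  // assumed: class 1 = agriculture

// One mean ET value per subregion feature (168 DETAW subregions).
var summary = agEt.reduceRegions({
  collection: subregions,
  reducer: ee.Reducer.mean(),
  scale: 30
});
```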
Calculates ETrF (EToF) per pixel by dividing the monthly ET by the monthly reference ETo. Spatial CIMIS was used as the ETo reference for the models, except when the modeling group provided their own reference ETo.
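Per pixel this is a single band-wise division; a minimal sketch with placeholder asset and band names:

```javascript
// Sketch only: ETrF = monthly ET / monthly reference ETo, per pixel.
// Asset IDs and band names are placeholders.
var et = ee.Image('users/example/ssj-delta-et/monthly_et').select('ET_06');
var eto = ee.Image('users/example/ssj-delta-et/spatial_cimis_eto').select('ETo_06');
var etrf = et.divide(eto).rename('ETrF');
```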
Scripts to produce daily time series for the models that were able to provide additional data for the overpasses and/or daily interpolated values between the overpass dates. Values are extracted from the raster stack at a 3x3 grid around the field stations, as well as at the CIMIS stations. The function can export Rn values as well as ET.
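One way to express the 3x3-pixel extraction in Earth Engine (the asset IDs are placeholders, and the 45 m buffer is an assumption: at a 30 m pixel scale it covers roughly a 3x3 block):

```javascript
// Sketch only: mean daily ET in a ~3x3 pixel window around each station.
// The asset IDs are placeholders.
var stations = ee.FeatureCollection('users/example/ssj-delta-cu/field_stations');
var dailyEt = ee.Image('users/example/ssj-delta-et/daily_et_20160610');

// Buffer each point by 45 m so a 30 m-scale reduction covers ~3x3 pixels.
var windows = stations.map(function(f) { return f.buffer(45); });

var sampled = dailyEt.reduceRegions({
  collection: windows,
  reducer: ee.Reducer.mean(),
  scale: 30
});
print(sampled);
```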
The `.json` files from Earth Engine are parsed and cleaned up in `data/data_postprocessing.Rmd`.