6 changes: 3 additions & 3 deletions EN4_postprocessing/merge_mean_surface_crps.py
@@ -40,7 +40,7 @@ def merge_by_time(season="all"):
with ProgressBar():
save_path = config.dn_out + "profiles/"
ds_index.to_netcdf(save_path + "%03s_CRPS_merged.nc"%(season))
print(f'File written to {config.dn_out+"%03s_CRPS_merged.nc"%(season)}')
print(f'File written to {save_path+"%03s_CRPS_merged.nc"%(season)}')

def regional_means():
""" mean over each region """
@@ -49,8 +49,8 @@ def regional_means():
fn_dom_nemo = "%s%s"%(config.dn_dom, config.grid_nc)
fn_cfg_nemo = config.fn_cfg_nemo
fn_cfg_prof = config.fn_cfg_prof
fn_analysis_crps = "%s%03s_CRPS_merged.nc"%(config.dn_out, season)
fn_out = "%s%03s_mask_means_crps_daily.nc"%(config.dn_out, season)
fn_analysis_crps = "%s%03s_CRPS_merged.nc"%(config.dn_out+"profiles/", season)
fn_out = "%s%03s_mask_means_crps_daily.nc"%(config.dn_out+"profiles/", season)

# get the CRPS data as a profile object
crps = coast.Profile(config=fn_cfg_prof)
9 changes: 4 additions & 5 deletions EN4_postprocessing/plot_surface_crps.py
@@ -27,10 +27,9 @@
# on each axis of the plot


# Assuming loading two configs: co7 and the P0.0. HARD WIRING. NOT IDEAL
fn_list = [config.dn_out+"All_mask_means_crps_daily.nc",
config.dn_out.replace(config.case,'co7')+"All_mask_means_crps_daily_co7.nc"]#

# Comparison case defined in config.
fn_list = [config.dn_out+"profiles/"+"All_mask_means_crps_daily.nc",
           config.comp_case["proc_data"]+"profiles/"+"All_mask_means_crps_daily.nc"]
#%% General Plot Settings
## Specify the specific regions, their labels and order of appearance to plot. Region indexing to match EN4_postprocessing mean_season.py
region_ind = [ 1, 7, 3, 2, 9, 5, 4, 6, 8] # Region indices (in analysis) to plot
@@ -101,7 +100,7 @@
var_name = "profile_mean_{0}_crps".format(var_str.lower()) # profile_mean_temperature_crps

# Filename for the output
fn_out = "FIGS/regional_{0}_{1}_{2}.pdf".format(var_name, config.case, 'co7')
fn_out = "FIGS/regional_{0}_{1}_{2}.pdf".format(var_name, config.case, config.comp_case["case"])

# Create plot and flatten axis array
f,a = plt.subplots(n_r, n_c, figsize = figsize, sharey = sharey)
11 changes: 11 additions & 0 deletions EN4_processing/spice_regional_masking.slurm
@@ -0,0 +1,11 @@
#!/bin/bash
#SBATCH --mem=40000
#SBATCH -o LOGS/%A_%a.out
#SBATCH -e LOGS/%A_%a.err
#SBATCH --time=180
#SBATCH --ntasks=1
source config.sh
source activate $CONDA_ENV

echo "python regional_masking.py $1"
python regional_masking.py $1 > LOGS/regional_masking.log
9 changes: 8 additions & 1 deletion PythonEnvCfg/spice_config.sh
@@ -8,7 +8,7 @@ export STARTYEAR=2013 #1980 # 2004
export ENDYEAR=2013 #1980 # 2014

## Process monthly data. Required in iter_sub_METEST.sh
export MOD="P1.5c" # Model reference name
export MOD="P2" # Model reference name
export GRID="domain_cfg_sf12.nc" # contains the grid information for NEMO
# options?: "domain_cfg_MEs_01-003_opt_v1.nc" # "GEG_SF12.nc"
# options?: "GEG_SF12.nc" CO7_EXACT_CFG_FILE.nc
@@ -42,3 +42,10 @@ export DN_DOM="/data/users/o.lambertbrown/Datasets/models/CO9/u-cu674/"
#fn_dat = "/scratch/fred/COMPARE_VN36_VN_4.0_TIDE_SSH/%s/DAILY/%s%02d*T.nc*"%(exper,startyear,month)
export DN_DAT="/data/users/o.lambertbrown/Datasets/models/CO9/u-cu674/"
export DN_OUT="/data/users/o.lambertbrown/Datasets/models/CO9/u-cu674/analysis/"

# Directories for comparison with another model/experiment. These are used when plotting/postprocessing.
export COMP_MOD="Cray-Ex"
export COMP_DAT="/data/scratch/o.lambertbrown/u-do888/"
export COMP_GRID="domain_cfg_sf12.nc"
export COMP_OUT="/data/users/o.lambertbrown/Datasets/models/CO9/u-do888/analysis/"
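These `COMP_*` exports are presumably surfaced to the Python scripts as attributes of `config` (e.g. the `config.comp_case` dictionary used in `plot_surface_crps.py`). A hypothetical sketch of how such a mapping could be assembled from the environment (the key names here are assumptions, not the repository's actual config code):

```python
import os

# Hypothetical: collect the COMP_* shell exports into a dict akin to
# config.comp_case. "case" holds the comparison model name, "proc_data"
# the directory of its processed analysis output.
comp_case = {
    "case": os.environ.get("COMP_MOD", ""),
    "proc_data": os.environ.get("COMP_OUT", ""),
}
print(sorted(comp_case))  # ['case', 'proc_data']
```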

51 changes: 27 additions & 24 deletions README.md
@@ -96,7 +96,7 @@ python map_profiles.py $1 $2 > LOGS/OUT_$1_$2.log
```
using arguments: $1 $2 corresponding to the above.

This outputs, in `DN_OUT/profiles/`, files like:
This outputs, in `DN_OUT/profiles/` (create the `DN_OUT/profiles` directory first), files like:
```
extracted_profiles_200401.nc
interpolated_profiles_200401.nc
@@ -105,7 +105,6 @@ profile_errors_200401.nc
surface_data_200401.nc
mid_data_200401.nc
bottom_data_200401.nc
mask_means_daily_200401.nc

```
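The `interpolated_*` files hold model and observed profiles mapped onto common reference depths; the core operation is a 1-D vertical interpolation. A toy sketch of that step (this is illustrative, not the COAST implementation):

```python
import numpy as np

# Toy observed profile on its native depths.
obs_depth = np.array([0.0, 10.0, 30.0])
obs_temp = np.array([15.0, 14.0, 12.0])

# Common reference depths shared by model and obs after interpolation.
ref_depth = np.array([0.0, 5.0, 20.0])

# Linear interpolation of the profile onto the reference depths.
temp_on_ref = np.interp(ref_depth, obs_depth, obs_temp)
print(temp_on_ref)
```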

@@ -122,12 +121,12 @@ cd ../EN4_processing

rm LOGS/OUT* LOGS/*.err LOGS/*.out

#sbatch -J 201407 --time=2:00:00 lotus_ana_MOD_METEST.sh 2014 7
#sbatch -J 201010 --time=2:00:00 lotus_ana_MOD_METEST.sh 2010 10
#sbatch -J 201011 --time=2:00:00 lotus_ana_MOD_METEST.sh 2010 11
sbatch -J 201109 --time=3:00:00 lotus_ana_MOD_METEST.sh 2011 9
#sbatch -J 201110 --time=2:00:00 lotus_ana_MOD_METEST.sh 2011 10
sbatch -J 200905 --time=3:00:00 lotus_ana_MOD_METEST.sh 2009 5
#sbatch -J 201407 --time=2:00:00 lotus_map_profiles.sh 2014 7
#sbatch -J 201010 --time=2:00:00 lotus_map_profiles.sh 2010 10
#sbatch -J 201011 --time=2:00:00 lotus_map_profiles.sh 2010 11
sbatch -J 201109 --time=3:00:00 lotus_map_profiles.sh 2011 9
#sbatch -J 201110 --time=2:00:00 lotus_map_profiles.sh 2011 10
sbatch -J 200905 --time=3:00:00 lotus_map_profiles.sh 2009 5
```

2. `PythonEnvCfg/<MACHINE>_config.sh` must be edited for machine choices, conda environment, paths etc.
@@ -138,43 +137,42 @@ Merge seasons (DJF, MAM, JJA, SON) from multiple years into single files.

Execute with:
```
iter_merge_season.sh
iter_extract_season.sh
```
which is just a simple concatenating loop over each season.

Each month invokes a machine-specific sbatch script (e.g. `spice_merge_season.sbatch`) where the model and season are
Each month invokes a machine-specific sbatch script (e.g. `spice_extract_season.sbatch`) where the model and season are
passed on to a generic script
`python merge_season.py $1 $2 #1=Model, 2=month`
`python extract_season.py $1 $2 #1=Model, 2=month`

Outputs are written to DN_OUT by season string, sss:
Outputs are written to DN_OUT/profiles by season string, sss:
```
sss_PRO_INDEX.nc ## merging interpolated_profiles_*.nc (model profiles on ref levels)
sss_PRO_DIFF.nc ## merging profile_errors_*.nc (diff between model & obs on ref levels)
sss_PRO_OBS.nc ## merging interpolated_obs_*.nc (obs on ref levels)
```
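The season extraction is essentially a concatenation of the monthly files whose month falls in each season. A minimal sketch of the grouping logic (the filename parsing shown here is illustrative, not taken from `extract_season.py`):

```python
# Map month number -> season string used in the output filenames.
SEASONS = {12: "DJF", 1: "DJF", 2: "DJF",
           3: "MAM", 4: "MAM", 5: "MAM",
           6: "JJA", 7: "JJA", 8: "JJA",
           9: "SON", 10: "SON", 11: "SON"}

def season_of(fname):
    """Return the season string for e.g. 'interpolated_profiles_200401.nc'."""
    yyyymm = fname.rsplit("_", 1)[-1].split(".")[0]  # '200401'
    return SEASONS[int(yyyymm[4:6])]

print(season_of("interpolated_profiles_200401.nc"))  # DJF
```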

### Create Means

Then call `iter_mean_season.sh` to compute the spatial means over subregions within the NWS domain.
Then call `regional_masking.sh` to compute the spatial means over subregions within the NWS domain.

This launches machine specific script

`sbatch ${MACHINE,,}_mean_season.sbatch $MOD $month`
`sbatch -J ${MOD}_regional_mask ${MACHINE,,}_regional_masking.slurm $MOD`
that in turn launches a machine independent script:
```
python mean_season.py $1 $2 > LOGS/mean_season_$1_$2.log # 1=Model, 2=month
python regional_masking.py $1 > LOGS/regional_masking.log # 1=Model
```

to compute averages in each of the defined regions:
This reads in sss_PRO_INDEX.nc and sss_PRO_DIFF.nc to compute averages in each of the defined regions:
```
region_names = [ 'N. North Sea','S. North Sea','Eng. Channel','Outer Shelf', 'Irish Sea',
'Kattegat', 'Nor. Trench', 'FSC', 'Off-shelf']
```
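Conceptually, each regional value is just a mean of the error field over a boolean region mask. A toy sketch of the operation (the repository does this through COAST mask makers, so everything here is illustrative):

```python
import numpy as np

# Toy 2x3 "error" field and a boolean mask selecting one region.
errors = np.array([[1.0, 2.0, 3.0],
                   [4.0, 5.0, 6.0]])
region_mask = np.array([[True, True, False],
                        [False, False, False]])

# Mean of the field over the masked region only.
regional_mean = errors[region_mask].mean()
print(regional_mean)  # 1.5
```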
Creating output:
Outputs are written to DN_OUT/profiles as:
```
DJF_mask_means_daily.nc
MAM_mask_means_daily.nc
JJA_mask_means_daily.nc
SON_mask_means_daily.nc
profiles_by_region_and_season.nc ### Model profiles
profile_bias_by_region_and_season.nc ### Difference between model and obs
```

### CRPS values
@@ -187,16 +185,16 @@ Execute: `. ./iter_surface_crps.sh`
This deploys monthly processes on ${MACHINE} (currently only tested on JASMIN's lotus)

```
sbatch "${MACHINE,,}"_surface_crps.sh $MOD $start $month $end $GRID
sbatch "${MACHINE,,}"_surface_crps.sh $start $month
```
which in turn launches the python script

```
python surface_crps.py $1 $2 $3 $4 $5
python surface_crps.py $1 $2
```

following the appropriate header commands for the batch scheduler.
Output files take the form: `surface_crps_data_p0_201101_2012.nc`
Output files are saved to DN_OUT/profiles and take the form: `surface_crps_data_p0_201101_2012.nc`
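The CRPS itself is computed inside `surface_crps.py` (via COAST). For orientation only, the empirical CRPS of an ensemble forecast against a single observation can be sketched from the standard identity CRPS = E|X − y| − ½ E|X − X′| (this is not the repository's implementation):

```python
import numpy as np

def empirical_crps(ensemble, obs):
    """Empirical CRPS of an ensemble forecast against a scalar observation.

    CRPS = mean(|x_i - obs|) - 0.5 * mean(|x_i - x_j|) over all member pairs.
    """
    x = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(x - obs))
    term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
    return term1 - term2

# A perfect deterministic forecast scores zero.
print(empirical_crps([10.0, 10.0], 10.0))  # 0.0
```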

Next, merge and compute regional averages, e.g. with `merge_mean_surface_crps.py` in `EN4_postprocessing`.

@@ -247,6 +245,11 @@ which submits the following machine independent script
```
python merge_mean_surface_crps.py $1 > LOGS/merge_mean_surface_crps_$1.log
```
This script merges all CRPS data into one file and then averages over each region. Outputs are written to DN_OUT/profiles:
```
All_CRPS_merged.nc
All_mask_means_crps_daily.nc
```

Finally the plots can be made with
```