Ultra low resolution configuration for testing #2508

Open · danholdaway opened this issue Nov 21, 2024 · 52 comments
Labels: enhancement (New feature or request)

@danholdaway

Description

In order to test changes to the global-workflow system we have to run the cycled system. Currently this is done with a C48, 127-level atmosphere and a 5-degree ocean model configuration. However, this is not fast enough for the testing and is limiting the number of pull requests that can be merged in global-workflow. Additionally, the ensemble and deterministic forecasts are both run at C48, meaning we are not replicating the dual-resolution setup that is used in production.

Note that the configuration does not have to produce anything scientifically sound; it just needs to run reliably in order to test the connections between tasks of the workflow.

Solution

  • Develop a C12, 32-level atmosphere with a 10-degree ocean configuration for the ensemble.
  • Develop a C18, 32-level atmosphere with a 5-degree ocean configuration for the deterministic forecast. If C18 is not a permissible configuration in FV3, then use C24.

Note that the ocean can already be run at 5 degrees.

The data assimilation group at EMC will work on enabling these configurations with GSI and JEDI.

@danholdaway added the enhancement (New feature or request) label Nov 21, 2024
@JessicaMeixner-NOAA (Collaborator)

@danholdaway - Please let me know if ultra-low wave configurations are also required; I'm happy to help provide those parts. (It's also fine to turn off waves for various tests.)

@danholdaway (Author)

Thank you @JessicaMeixner-NOAA, I think it would be useful to include waves in this effort. We aren't currently testing with waves turned on, but we should start to include that.

@junwang-noaa (Collaborator)

@danholdaway Our team will look into atm/ocn/ice components for the test you requested. May I ask some background questions:

  1. For the atmosphere, is the choice of C12/C18/C24 (~800 km/500 km/400 km resolution; see the sketch below this list for where these figures come from) driven only by running speed, or are there other reasons? Do you have test cases with 32 vertical levels for us to use as a configuration reference? I am thinking that we may have to use a different atmospheric physics package at this coarse resolution than those used for GFS/GEFS/SFS, and the dycore needs to be hydrostatic, which is different from GFS/GEFS.
  2. For the ocean, do you know if there is a 10-degree supergrid available? I assume you are OK with a 5-degree ocean/ice if we can't find a 10-degree supergrid.
  3. How long do you expect these tests to run? I assume they only need to run 15 hours with IAU. There might be some coupling configuration changes if a longer run time is expected. Thanks.
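
For orientation, here is a rough sketch (my own, not from any of the repos discussed here) of where the ~800/500/400 km figures come from: each face of the cubed sphere spans roughly a quarter of the Earth's circumference, so the nominal grid spacing of a CN grid is about 10,000 km / N.

```python
# Rough nominal grid spacing for a cubed-sphere CN grid. Each of the six cube
# faces spans about a quarter of the Earth's circumference (~40,000 km), so the
# cell size is roughly 10,000 km / N. Approximation for orientation only.
EARTH_CIRCUMFERENCE_KM = 40_075.0

for n in (12, 18, 24, 48):
    dx_km = EARTH_CIRCUMFERENCE_KM / 4.0 / n
    print(f"C{n}: ~{dx_km:.0f} km")
# C12: ~835 km, C18: ~557 km, C24: ~417 km, C48: ~209 km
```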

@danholdaway (Author)

Thanks for looking into this @junwang-noaa, we very much appreciate the help.

  1. I had a quick play around with JEDI and found that I couldn't create a C18 geometry with FV3, so that might be out of the question. The choice of C12/C24 is driven by running speed. For most of the PRs we create we don't want to test what the model does, but rather the connections between the tasks of the workflow; we just need the model to run from one cycle to the next and output files for the DA to pick up. Changing the model configuration, e.g. the physics or switching from non-hydrostatic to hydrostatic, should be fine, though we need to be aware that we might be reading less from the files, e.g. no delz to pick up. We don't have any configuration for 32 levels yet. We thought about starting with the 127-level file and taking roughly every 4th level (see the sketch after this list). Do you think that would work?

  2. @guillaumevernieres do you know if any 10-degree grids exist for MOM6? I think you mentioned you'd tested something at that resolution in SOCA.

  3. We don't need long forecasts with the coupled model. Just long enough to cycle. I think that's just 12h with IAU.
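
As a concrete illustration of the "every 4th level" idea in item 1, here is a minimal sketch (my own; the input file name and column layout are assumptions, and this is not an existing utility):

```python
# Sketch: build a reduced set of ak/bk interface coefficients by keeping roughly
# every 4th interface of the 128-interface (127-level) table. The input file name
# and layout (one header line, then one ak/bk pair per line) are assumed.
import numpy as np

akbk = np.loadtxt("global_hyblev.l128.txt", skiprows=1)

reduced = akbk[::4]                                  # keep every 4th interface
if not np.array_equal(reduced[-1], akbk[-1]):
    reduced = np.vstack([reduced, akbk[-1]])         # always retain the surface interface

print(f"{len(reduced)} interfaces -> {len(reduced) - 1} model levels")
np.savetxt("global_hyblev.l033.txt", reduced)        # hypothetical output name
```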

@guillaumevernieres

@danholdaway, we do have a 10-deg grid for MOM6. Something would need to be done for CICE6 as well.

@DeniseWorthen (Collaborator)

We create the CICE6 fix files from the MOM6 supergrid and mask. We'd need a MOM_input consistent w/ 10-deg MOM6 physics.

@junwang-noaa (Collaborator)

@danholdaway Thanks for the information.
@guillaumevernieres Would you please share the 10deg mom6 super grid and run configuration with us?

@guillaumevernieres

> @danholdaway Thanks for the information. @guillaumevernieres Would you please share the 10deg mom6 super grid and run configuration with us?

Let me see what we have. We did that work ages ago and I don't remember if we went beyond being able to use it within soca/jedi.

@guillaumevernieres

@DeniseWorthen, @junwang-noaa, here's a link to the JCSDA soca repo with, hopefully, all the needed bits and pieces:
https://github.com/JCSDA/soca/tree/develop/test/Data/36x17x25

@DeniseWorthen (Collaborator)

To utilize the cpld_gridgen utility, we need an ocean_hgrid.nc file, a topography file, and a mask file for MOM6. The grid_spec file you point to seems to already be at the target resolution (17x36x35).

I see that MOM_input is asking for the supergrid file GRID_FILE = "ocean_hgrid_small.nc" and the topography file ocean_topog_small.nc. Those are the files we'd need. Do you have them somewhere?

@guillaumevernieres

> To utilize the cpld_gridgen utility, we need an ocean_hgrid.nc file, a topography file, and a mask file for MOM6. The grid_spec file you point to seems to already be at the target resolution (17x36x35).
>
> I see that MOM_input is asking for the supergrid file GRID_FILE = "ocean_hgrid_small.nc" and the topography file ocean_topog_small.nc. Those are the files we'd need. Do you have them somewhere?

Yes, sorry, I thought these were in one of the sub-directories, but apparently not. I'll get back to you when I find them.

@guillaumevernieres

@DeniseWorthen, the files are on MSU:

/work/noaa/da/gvernier/mom6-10geg/36x17x25

Do you need the MOM.res.nc restart as well?

@junwang-noaa (Collaborator)

@yangfanglin do you have a physics package for ultra-low resolutions (~800 km/400 km) that we can use to set up the C12/C24 tests with 32 vertical levels? Thanks

@DeniseWorthen (Collaborator)

> @DeniseWorthen, the files are on MSU:
>
> /work/noaa/da/gvernier/mom6-10geg/36x17x25
>
> Do you need the MOM.res.nc restart as well?

I don't need restarts to generate the fix files. Are you setting the land mask = 0 where depth = 0 (unless you have a mask file somewhere)?

@GeorgeGayno-NOAA (Contributor)

Note: UFS_UTILS requires some slight updates to create low-resolution grids. This is being worked on here:
ufs-community/UFS_UTILS#1000

@DeniseWorthen (Collaborator)

@guillaumevernieres I can't make sense of the ocean_hgrid_small file. The supergrid should contain 2x the desired resolution, so for a final resolution of 36x17 it should be 72x34, but I see 72x70 in the file.
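
For reference, a quick way to check the supergrid dimensions described above (a sketch assuming the usual FMS ocean_hgrid dimension names nx/ny):

```python
# Verify that the supergrid is 2x the target ocean grid in each direction.
# Dimension names nx/ny follow the common FMS ocean_hgrid convention (assumed).
import xarray as xr

hgrid = xr.open_dataset("ocean_hgrid_small.nc")
nx, ny = hgrid.sizes["nx"], hgrid.sizes["ny"]
print(f"supergrid {nx} x {ny} -> model grid {nx // 2} x {ny // 2}")
# For a 36x17 ocean grid this should report a 72 x 34 supergrid.
```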

@DeniseWorthen (Collaborator)

Establishing a 10-deg ocean resolution is going to play havoc w/ the naming convention and have knock-on impacts from fix file generation down through the regression test scripts, inputs, etc. This is because we currently use a 3-character string to denote the ocean/ice resolution; for example, mx050 is 1/2 deg and mx100 is one deg. Creating a 10-deg resolution would require a 4-character string, so 1/2 deg would need to become mx0050, 5 deg would be mx0500, and 10 deg would be mx1000.

I wonder if a 9-deg ocean/ice resolution would be a better idea. That would be a 40x20 grid (vs 10deg = 36x17) but would avoid the issue with file naming and file regeneration, RT modifications etc.
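
For reference, a minimal sketch of the mx naming convention described above (my own illustration, not code from cpld_gridgen or the workflow):

```python
# The ocean/ice resolution tag is the resolution in degrees times 100,
# zero-padded to three characters; 10 deg would need a fourth character.
def mx_tag(res_deg: float) -> str:
    return f"mx{round(res_deg * 100):03d}"

for res in (0.25, 0.5, 1.0, 5.0, 9.0, 10.0):
    print(res, "->", mx_tag(res))
# 0.25 -> mx025, 0.5 -> mx050, 1.0 -> mx100, 5.0 -> mx500, 9.0 -> mx900,
# but 10.0 -> mx1000, which no longer fits the 3-character field.
```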

@danholdaway (Author)

Thanks for reporting this @DeniseWorthen. It's probably not worth changing the entire naming convention for this, and I think 9 degrees would suffice.

@DeniseWorthen (Collaborator) commented Dec 4, 2024

I've been able to create a 9-deg configuration for ocean and produce the associated CICE grid files using the ufs-utils/cpld_gridgen utility. I'll also document this in the ufs-utils PR I'll open. I have not tried to run this yet, but I'll try w/ a DATM-OCN-ICE configuration next.

The ufs-utils repo has some of the required fre-nctools available to build, but not the make_topog tool, which is also needed. I found the tools installed system-wide on Gaea-C5, so I was able to use those to do the following:

  1. Make the supergrid for 9 deg:
/ncrc/home2/fms/local/opt/fre-nctools/2024.04/ncrc5/bin/make_hgrid --grid_type tripolar_grid --nxbnd 2 --nybnd 2 --xbnd -280,80 --ybnd -85,90 --nlon 80 --nlat 40 --grid_name ocean_hgrid --center c_cell
  2. Make the ocean mosaic:
/ncrc/home2/fms/local/opt/fre-nctools/2024.04/ncrc5/bin/make_solo_mosaic --num_tiles 1 --dir ./ --mosaic_name ocean_mosaic --tile_file ocean_hgrid.nc --periodx 360
  3. Make the ocean topography:
/ncrc/home2/fms/local/opt/fre-nctools/2024.04/ncrc5/bin/make_topog --mosaic ocean_mosaic.nc --topog_type realistic --topog_file /ncrc/home2/fms/archive/mom4/input_data/OCCAM_p5degree.nc --topog_field TOPO --scale_factor=-1
  4. Make the ocean mask:
ncap2 -s 'maskdpt=0.0*depth;where(depth>=10.)maskdpt=1.0;tmask=float(maskdpt);tmask(0,:)=0.0' topog.nc ocean_mask.nc
ncks -O -x -v depth,maskdpt ocean_mask.nc ocean_mask.nc
ncrename -v tmask,mask ocean_mask.nc
ncatted -a standard_name,,d,, -a units,,d,, ocean_mask.nc
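
For readers less familiar with the NCO tools, here is a rough Python equivalent of the mask step above (an illustration only; the fix files themselves were made with the commands shown):

```python
# Rough Python equivalent of the ncap2/ncks/ncrename mask step above.
# File and variable names are taken from the commands; illustration only.
import numpy as np
import xarray as xr

topog = xr.open_dataset("topog.nc")
depth = topog["depth"]

mask = np.where(depth.values >= 10.0, 1.0, 0.0).astype("float32")  # ocean where depth >= 10 m
mask[0, :] = 0.0                                                   # close the southernmost row, as in tmask(0,:)=0.0

xr.Dataset({"mask": (depth.dims, mask)}).to_netcdf("ocean_mask.nc")
```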

@guillaumevernieres

@danholdaway (Author)

Great progress, thank you @DeniseWorthen

@DeniseWorthen (Collaborator) commented Dec 4, 2024

Well, by golly, it ran! I completed 24 hours using the DATM coupled to the 9deg MOM6/CICE6.

/work2/noaa/stmp/dworthen/CPLD_GRIDGEN/rt_349784/datm.mx900

@guillaumevernieres

> Well, by golly, it ran! I completed 24 hours using the DATM coupled to the 9deg MOM6/CICE6.

You rock @DeniseWorthen 🎉 🎉 Thanks for doing this!

@DeniseWorthen (Collaborator) commented Dec 5, 2024

The next step is to generate the mapped ocean masks for the new ATM resolutions. These are used by chgres to create the oro-data and ICs. It sounds like you definitely want C12+9deg and C18/24+5deg, but there is still some question of what ATM configuration (number of levels, physics) will work.

George has created the required low-res C-grids, and I've created the mapped ocean masks (e.g. C12.mx900.tile1.nc) on Hercules here:

/work2/noaa/stmp/dworthen/CPLD_GRIDGEN/rt_2310699

It seems like the best next step is to try to create the ATM ICs and see if any of them run.

@LarissaReames-NOAA (Collaborator)

Yeah, my plan is to try using global_hyblev.l28.txt for vertical levels to create ATM ICs from existing chgres_cube regression test input GFS data once we have C12.mx900 orography files. It looks like George was able to create C12mx100 files on Hera (/scratch1/NCEPDEV/da/George.Gayno/ufs_utils.git/UFS_UTILS/fix/orog.lowres), so hopefully the mx900 files shouldn't give us problems.

@DeniseWorthen (Collaborator)

OK, thanks. Beyond generating the mapped ocean masks, I'm not well-versed in the process, so I won't be much help.

@JessicaMeixner-NOAA (Collaborator)

For waves, I have created two grids: a 9-deg regular lat/lon grid and a 900 km unstructured grid.

g-w fix files are here: /scratch1/NCEPDEV/climate/Jessica.Meixner/LowResWave/wave-gw-fix (other updates to the workflow are needed as well)
and the new WW3_input directory for ufs can be found here: /scratch1/NCEPDEV/climate/Jessica.Meixner/LowResWave/WW3_input_data_20241206

Please note I have not yet tested any of this, but I wanted to share it now.

@GeorgeGayno-NOAA (Contributor)

> The next step is to generate the mapped ocean masks for the new ATM resolutions. These are used by chgres to create the oro-data and ICs. It sounds like you definitely want C12+9deg and C18/24+5deg, but there is still some question of what ATM configuration (number of levels, physics) will work.
>
> George has created the required low-res C-grids, and I've created the mapped ocean masks (e.g. C12.mx900.tile1.nc) on Hercules here:
>
> /work2/noaa/stmp/dworthen/CPLD_GRIDGEN/rt_2310699
>
> It seems like the best next step is to try to create the ATM ICs and see if any of them run.

Thanks @DeniseWorthen. I am guessing that you had to update the cpld_gridgen source code in order to generate these resolutions. Do you want to push those changes to 'develop'? If so, I can include them in my branch.

@DeniseWorthen (Collaborator)

@GeorgeGayno-NOAA What I did was link in your "lowres" directories; otherwise I couldn't grab the ATM mosaic files that I needed. So it's a bit of a chicken-and-egg problem. Can we, at this stage, just add the ATM mosaic/grid files for all three resolutions (C12, C18, C24) to the fix files?

@LarissaReames-NOAA (Collaborator) commented Dec 10, 2024

Posted this in the wrong issue earlier, so moving it here:

Okay, I have a working ATM-only configuration of the C12.mx500 forecast. I can produce a 24-hour forecast with 42 levels without issue. I based the physics options on the C48 v2 ATM-only test, which uses the CCPP suite FV3_GFS_v16, but modified it to remove the GWD scheme and turn on the hydrostatic core/physics. The forecast is on Hera here:

/scratch1/NCEPDEV/stmp2/Larissa.Reames/FV3_RT/rt_3367148/control_c12_intel

I can test the same configuration when we have C12.mx900 grid/fix files, which George has said he's going to move over to Hera from Hercules.

@DeniseWorthen (Collaborator)

@LarissaReames-NOAA I had to make a couple of changes to the MOM_input, ice_in, etc. to convert from 5 deg to 9 deg. I had listed the run directory above, but then renamed it, so it is now in /work2/noaa/stmp/dworthen/CPLD_GRIDGEN/test9deg/datm.mx900.

@GeorgeGayno-NOAA (Contributor)

> @LarissaReames-NOAA I had to make a couple of changes to the MOM_input, ice_in, etc. to convert from 5 deg to 9 deg. I had listed the run directory above, but then renamed it, so it is now in /work2/noaa/stmp/dworthen/CPLD_GRIDGEN/test9deg/datm.mx900.

@DeniseWorthen - do these files need to be regenerated? /work2/noaa/stmp/dworthen/CPLD_GRIDGEN/rt_2310699

@DeniseWorthen (Collaborator)

@GeorgeGayno-NOAA Those files should be fine. My test case used a DATM but I need the various meshes and CICE fix files generated by cpld_gridgen for the 9deg resolution.

I'm happy to commit the changes to cpld_gridgen to generate these files, if we've settled on C12 and the mosaic and grid files can be added to the fix file directories.

@GeorgeGayno-NOAA (Contributor)

Created C12/C18 and C24 grids. They are on Hera.

ufs-community/UFS_UTILS#1000 (comment)

@DeniseWorthen (Collaborator)

@GeorgeGayno-NOAA Are we planning on supporting (in the sense of generating the fix files) all three super-low ATM resolutions?

@GeorgeGayno-NOAA (Contributor)

> @GeorgeGayno-NOAA Are we planning on supporting (in the sense of generating the fix files) all three super-low ATM resolutions?

I don't know. That is a question for the group.

@LarissaReames-NOAA (Collaborator)

C12.mx900 and C24.mx500 ATM-only 24-hour forecasts were both successful on Hera:
/scratch1/NCEPDEV/stmp2/Larissa.Reames/FV3_RT/rt_3367148/control_c12_intel
/scratch1/NCEPDEV/stmp2/Larissa.Reames/FV3_RT/rt_3367148/control_c24_intel

I think we're good to try with coupled configurations.

@danholdaway (Author)

Thank you @LarissaReames-NOAA and the rest of the team. How many model levels are in these configurations?

@LarissaReames-NOAA (Collaborator) commented Dec 11, 2024

> Thank you @LarissaReames-NOAA and the rest of the team. How many model levels are in these configurations?

I'm currently using the npz=41 (levp=42) level file from fix/am. I'm open to trying the 28-level file if you think 42 is too many, but a 42-level forecast runs on one node with only 6 cores (aside from those used for the write component) in a matter of seconds, so I'm not sure that going down to 28 will save all that much time.

@DeniseWorthen (Collaborator)

@LarissaReames-NOAA Would you like me to try building/running the coupled c12-9deg case?

@LarissaReames-NOAA (Collaborator)

@DeniseWorthen I think it's definitely worth a shot at this point.

@DeniseWorthen (Collaborator)

OK, I just didn't want to duplicate efforts.

@danholdaway (Author)

Thank you. I think 42 levels is perfectly fine if that's what's working now; it's plenty of speedup compared to 128.

@DeniseWorthen (Collaborator)

@LarissaReames-NOAA Can you build a coupled SDF to go w/ your FV3_C12?

@LarissaReames-NOAA (Collaborator)

@DeniseWorthen I'm no expert with CCPP suites, but I compared similarly named files in the suites directory, one coupled and the other not, and it looks like the only difference is adding "sfc_cice" before "sfc_land" in the surface loop. I've done that, and it compiles fine with this suite definition file:
/scratch1/BMC/gsd-fv3/Larissa.Reames/ufs-weather-model-develop/FV3/ccpp/suites/suite_FV3_coupled_lowres.xml

@DeniseWorthen (Collaborator)

I'm building the input-data directories needed for the coupled tests. A couple of questions...

  1. Can I use the sfc/gfs data files from Larissa's stand-alone C12 control case for the coupled test?
  2. Do I need the oro_data_ss and oro_data_ls files?

@LarissaReames-NOAA (Collaborator)

> I'm building the input-data directories needed for the coupled tests. A couple of questions...
>
>   1. Can I use the sfc/gfs data files from Larissa's stand-alone C12 control case for the coupled test?

Yes. I built them with the coupled grid files, so they should work.

>   2. Do I need the oro_data_ss and oro_data_ls files?

These are for the GWD scheme, so they should be unnecessary.

@DeniseWorthen (Collaborator)

@LarissaReames-NOAA Great, that is what I thought, but wanted to ask...

@DeniseWorthen (Collaborator)

I have an initial cpld case which starts up, but then dies at:

0: NOTE from PE     0: reading topographic/orographic information from INPUT/oro_data.tile*.nc
 0:
 0: FATAL from PE     0: Error with opening file INPUT/oro_data_ss.nc

so I know I haven't gotten the input.nml set correctly. I'm trying to use global_control.nml.IN.

My branch is https://github.com/DeniseWorthen/ufs-weather-model/tree/feature/ultralow if anyone wants to take a stab at exporting all the right values in the cpld_control_c12 test.

@DeniseWorthen (Collaborator)

@LarissaReames-NOAA I've updated my ultralow feature branch to the top of develop. This contains a test (cpld_control_c12) where I was trying to reconcile the input.nml settings, and an rt.ultra to run the test.

The input data is staged in a special directory on Hera (INPUTDATA_ROOT=/scratch2/NCEPDEV/stmp3/Denise.Worthen/input-data-20240501). You should be able to write to that directory if you find any incorrect input data.

@LarissaReames-NOAA (Collaborator)

> @LarissaReames-NOAA I've updated my ultralow feature branch to the top of develop. This contains a test (cpld_control_c12) where I was trying to reconcile the input.nml settings, and an rt.ultra to run the test.
>
> The input data is staged in a special directory on Hera (INPUTDATA_ROOT=/scratch2/NCEPDEV/stmp3/Denise.Worthen/input-data-20240501). You should be able to write to that directory if you find any incorrect input data.

Thanks for getting the ICs staged. It looks like the mx900 CICE ICs are missing? I think they should be in /scratch2/NCEPDEV/stmp3/Denise.Worthen/input-data-20240501/CICE_IC/900/2021032206/ but the only thing in there is another directory, '900', with invalid.nc inside. Is the RT workflow pointing to the wrong place in your directory structure?

@DeniseWorthen (Collaborator)

@LarissaReames-NOAA I don't have an IC to use for CICE, so I just put a random file there for now so that the RT scripts would work and we could get it running. In cpld_control_c12, I set export CICE_ICE_IC=default. This lets CICE run w/ a specified initial ice concentration and thickness.

@LarissaReames-NOAA (Collaborator)

Okay, I have a working coupled C12 test. I pushed the test (cpld_control_c12) and the required settings changes to my feature/lowres branch. There's also a lowres_rt.conf to run the test. I created a new namelist based on the one I used in the original ATM tests to make this work.
