Ultra low resolution configuration for testing #2508
Comments
@danholdaway - Please let me know if ultra-low wave configurations are also required and I'm happy to help provide those parts. (Also fine to turn off waves for various tests too.)
Thank you @JessicaMeixner-NOAA. I think it would be useful to include waves in this effort. We aren't currently testing with waves turned on, but we should start to include that.
@danholdaway Our team will look into atm/ocn/ice components for the test you requested. May I ask some background questions:
Thanks for looking into this @junwang-noaa, we very much appreciate the help.
@danholdaway, we do have a 10deg grid for mom6. Something would need to be done for cice6 as well.
We create the CICE6 fix files from the MOM6 supergrid and mask. We'd need a MOM_input consistent w/ a 10deg MOM6 physics.
@danholdaway Thanks for the information.
Let me see what we have. We did that work ages ago and I don't remember if we went beyond being able to use it within soca/jedi.
@DeniseWorthen, @junwang-noaa, here's a link to the jcsda soca repo with hopefully all the needed bits and pieces:
To utilize the cpld_gridgen utility, we need an ocean_hgrid.nc file, a topography file, and a mask file for MOM6. The grid_spec file you point to seems to already be at the target resolution (17x36x35). I see that MOM_input is asking for the super grid file
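As an aside, a minimal sketch (not part of cpld_gridgen) of the kind of consistency check this implies, assuming the usual FMS convention that ocean_hgrid.nc carries twice the model cell counts in each direction; the file name and expected sizes below are illustrative, not taken from the repo:

```python
# Hypothetical sanity check before feeding files to cpld_gridgen: confirm the
# supergrid dimensions match the target ocean resolution (FMS supergrids hold
# 2x the tracer-cell counts in each direction).
from netCDF4 import Dataset

def check_supergrid(hgrid_path, ni_expected, nj_expected):
    with Dataset(hgrid_path) as nc:
        nx = nc.dimensions["nx"].size   # supergrid cells in x
        ny = nc.dimensions["ny"].size   # supergrid cells in y
    ok = (nx == 2 * ni_expected) and (ny == 2 * nj_expected)
    print(f"{hgrid_path}: {nx}x{ny} supergrid, expected "
          f"{2 * ni_expected}x{2 * nj_expected} -> {'OK' if ok else 'MISMATCH'}")
    return ok

# e.g. a 10-deg ocean with 36x17 tracer cells implies a 72x34 supergrid
check_supergrid("ocean_hgrid.nc", ni_expected=36, nj_expected=17)
```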
Yes, sorry, I thought these were in one of the sub-dirs but apparently not. I'll get back to you when I find these files.
@DeniseWorthen, the files are on MSU:
Do you need the MOM.res.nc restart as well?
@yangfanglin do you have a physics package for the ultra-low resolutions (~800km/400km) that we can use to set up the C12/C24 with 32 vertical level tests? Thanks
I don't need restarts to generate the fix files. Are you setting the land mask = 0 where depth = 0 (unless you have a mask file somewhere)?
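For illustration, the masking rule being asked about amounts to deriving the land/sea mask directly from the topography; a minimal numpy sketch, with the file and variable names assumed rather than taken from the thread:

```python
# Sketch of "land mask = 0 where depth = 0": build the ocean/land mask from
# the topography field itself when no separate mask file is available.
import numpy as np
from netCDF4 import Dataset

with Dataset("topog.nc") as nc:          # assumed file/variable names
    depth = np.asarray(nc.variables["depth"][:])

mask = (depth > 0).astype("int8")        # 1 = ocean, 0 = land
print("ocean points:", int(mask.sum()), "of", mask.size)
```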
Note: UFS_UTILS requires some slight updates to create low-resolution grids. This is being worked on here:
@guillaumevernieres I'm not making sense of the
Establishing a 10-deg ocean resolution is going to play havoc w/ the naming convention and have knock-on impacts from fix file generation down through the regression test scripts and inputs, etc. This is because we currently use a 3-character string to denote the ocean/ice resolution. For example, mx050 is 1/2 deg, mx100 is one deg, etc. Creating a 10-deg resolution would require a 4-character string, so the 1/2 deg would need to be mx0050, 5-deg would be mx0500, and 10-deg would be mx1000. I wonder if a 9-deg ocean/ice resolution would be a better idea. That would be a 40x20 grid (vs 10 deg = 36x17) but would avoid the issue with file naming and file regeneration, RT modifications, etc.
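To make the naming point concrete, here is a small illustration (not any UFS tool) of the mx tag convention and why a 10-deg grid overflows it:

```python
# The ocean/ice resolution is encoded as hundredths of a degree in the tag:
# mx050 = 1/2 deg, mx100 = 1 deg, mx500 = 5 deg. Ten degrees needs four digits.
def mx_tag(res_deg):
    return f"mx{round(res_deg * 100):03d}"

for res in (0.25, 0.5, 1.0, 5.0, 9.0, 10.0):
    tag = mx_tag(res)
    status = "fits the 3-digit scheme" if len(tag) == 5 else "needs 4 digits"
    print(f"{res:>5} deg -> {tag:<6} ({status})")
```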
Thanks for reporting this @DeniseWorthen. Probably not worth changing the entire naming convention for this, and I think 9 degrees would suffice.
I've been able to create a 9-deg configuration for ocean and produce the associated CICE grid files using the ufs-utils/cpld_gridgen utility. I'll also document this in the ufs-utils PR I'll open. I have not tried to run this yet, but I'll try w/ a DATM-OCN-ICE configuration next. The ufs-utils repo has some of the required fre-nctools available to build, but not the make_topog tool, which is also required. I found the tools were installed system-wide on Gaea-C5, so I was able to use those to do the following:
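The exact make_hgrid/make_topog commands are not reproduced above; as a rough illustration of the grid arithmetic they encode (using the 40x20 figure quoted earlier for 9 deg), the tracer-cell and supergrid sizes scale as:

```python
# Nominal grid sizes for a regular lat/lon ocean grid; the supergrid written
# for MOM6/CICE fix-file generation is twice the tracer-cell count in each
# direction. Real grids may drop rows near the poles, so treat these as nominal.
def ocean_dims(res_deg):
    ni = round(360 / res_deg)            # tracer cells in longitude
    nj = round(180 / res_deg)            # tracer cells in latitude (nominal)
    return ni, nj, 2 * ni, 2 * nj        # plus supergrid dimensions

ni, nj, snx, sny = ocean_dims(9.0)
print(f"9-deg ocean: {ni}x{nj} cells, {snx}x{sny} supergrid")   # 40x20, 80x40
```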
Great progress, thank you @DeniseWorthen.
Well, by golly, it ran! I completed 24 hours using the DATM coupled to the 9deg MOM6/CICE6. The run directory is /work2/noaa/stmp/dworthen/CPLD_GRIDGEN/rt_349784/datm.mx900
You rock @DeniseWorthen 🎉 🎉 Thanks for doing this!
The next step is to generate the mapped ocean masks for the new ATM resolutions. These are used by chgres to create the oro-data and ICs. It sounds like you definitely want C12+9deg and C18/24+5deg, but there's still a question of which ATM configuration (# of levels, physics) will work. George has created the required low-res C-grids, and I've created the mapped ocean masks (eg.
It seems like the best next step is to try to create the ATM ICs and see if any of them run?
Yeah, my plan is to try using global_hyblev.l28.txt for vertical levels to create ATM ICs from existing chgres_cube regression test input GFS data once we have C12.mx900 orography files. It looks like George was able to create C12mx100 files on Hera (
OK, thanks. After generating the mapped ocean masks, I'm not well-versed in the rest of the process, so I won't be much help.
For waves, I have created two grids: a 9 deg regular lat/lon grid and a 900km unstructured grid. g-w fix files are here: /scratch1/NCEPDEV/climate/Jessica.Meixner/LowResWave/wave-gw-fix (other updates to the workflow are needed as well). Please note I have not yet tested any of this, but wanted to share now.
Thanks @DeniseWorthen. I am guessing that you had to update the
@GeorgeGayno-NOAA What I did was to link in your "lowres" directories; otherwise I couldn't grab the ATM mosaic files that I needed. So it's a bit of a chicken-and-egg problem. Can we, at this stage, just add the ATM mosaic/grid files for all three resolutions (C12, C18, C24) to the fix files?
Posted this in the wrong issue earlier, so moving it here: Okay, I have a working ATM-only configuration of the C12.mx500 forecast. I can produce a 24-hour forecast with 42 levels without issue. I based the physics options on the C48 v2 ATM-only test, which uses CCPP suite FV3_GFS_v16, but modified to remove the GWD scheme and with the hydrostatic core/physics turned on. The forecast is on Hera here: /scratch1/NCEPDEV/stmp2/Larissa.Reames/FV3_RT/rt_3367148/control_c12_intel I can test the same configuration when we have C12.mx900 grid/fix files, which George has said he's going to move over to Hera from Hercules.
@LarissaReames-NOAA I had to make a couple of changes to the MOM_input, ice_in, etc. to convert from 5deg->9deg. I had listed the run directory up above, but then renamed it, so it is now in /work2/noaa/stmp/dworthen/CPLD_GRIDGEN/test9deg/datm.mx900.
@DeniseWorthen - do these files need to be regenerated?
@GeorgeGayno-NOAA Those files should be fine. My test case used a DATM, but I need the various meshes and CICE fix files generated by cpld_gridgen. I'm happy to commit the changes to
Created C12, C18, and C24 grids. They are on Hera.
@GeorgeGayno-NOAA Are we planning on supporting (in the sense of generating the fix files) all three super-low ATM resolutions?
I don't know. That is a question for the group.
C12.mx900 and C24.mx500 ATM-only 24-hr forecasts were both successful on Hera: I think we're good to try with coupled configurations.
Thank you @LarissaReames-NOAA and the rest of the team. How many model levels are in these configurations?
I'm currently using the npz=41 (levp=42) level file from fix/am. I'm open to trying the 28-level file if you think 42 is too much, but a 42-level forecast runs on one node with only 6 cores (aside from those used for the write component) in a matter of seconds, so I'm not sure that going down to 28 will save all that much time.
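As a back-of-envelope check of that point, assuming per-step cost scales roughly linearly with the number of levels (a simplification), the absolute savings from 42 to 28 levels is indeed small when the whole forecast takes only seconds:

```python
# Rough estimate only; the 30 s runtime is an assumed stand-in for
# "a matter of seconds" from the comment above.
levels_now, levels_alt = 42, 28
runtime_now_sec = 30.0

est_alt = runtime_now_sec * levels_alt / levels_now
saved = runtime_now_sec - est_alt
print(f"~{runtime_now_sec:.0f}s at {levels_now} levels -> ~{est_alt:.0f}s at "
      f"{levels_alt} levels (saves ~{saved:.0f}s per forecast)")
```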
@LarissaReames-NOAA Would you like me to try building/running the coupled c12-9deg case?
@DeniseWorthen I think it's definitely worth a shot at this point.
OK, I just didn't want to duplicate efforts.
Thank you. I think 42 levels is perfectly fine if that's what's working now; plenty of speed-up compared to 128.
@LarissaReames-NOAA Can you build a coupled SDF to go w/ your
@DeniseWorthen I'm no expert with CCPP suites, but I looked at a few examples in the suites directory to compare similarly named files, with one coupled and the other not, and it looks like the only difference is adding "sfc_cice" before "sfc_land" in the surface loop. I've done that, and it compiles fine with this suite definition file:
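For reference, the described change could be scripted along these lines; this is only a sketch of the edit, the file names are hypothetical, and the actual SDF was edited by hand:

```python
# Insert <scheme>sfc_cice</scheme> immediately before <scheme>sfc_land</scheme>
# in a CCPP suite definition file (scheme names taken from the comment above).
import xml.etree.ElementTree as ET

tree = ET.parse("suite_FV3_GFS_v16_c12.xml")          # hypothetical file name
for subcycle in tree.getroot().iter("subcycle"):
    children = list(subcycle)
    for i, elem in enumerate(children):
        if elem.tag == "scheme" and (elem.text or "").strip() == "sfc_land":
            new = ET.Element("scheme")
            new.text = "sfc_cice"
            subcycle.insert(i, new)                    # place it before sfc_land
            break
tree.write("suite_FV3_GFS_v16_c12_coupled.xml")
```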
I'm building the input-data directories needed for the coupled tests. A couple of questions...
Yes. I built them with the coupled grid files, so they should work.
These are for the GWD scheme, so they should be unnecessary.
@LarissaReames-NOAA Great, that is what I thought, but wanted to ask...
I have an initial cpld case which starts up, but then dies at
so I know I haven't gotten the settings right yet. My branch is https://github.com/DeniseWorthen/ufs-weather-model/tree/feature/ultralow if anyone wants to take a stab at exporting all the right values in the
@LarissaReames-NOAA I've updated my ultralow feature branch to the top of develop. This contains a test (cpld_control_c12). The input data is staged in a special directory on Hera (
Thanks for getting the ICs staged. It looks like the mx900 CICE ICs are missing? I think they should be in /scratch2/NCEPDEV/stmp3/Denise.Worthen/input-data-20240501/CICE_IC/900/2021032206/ but the only thing in there is another directory, '900', with invalid.nc inside. Is the RT workflow pointing to the wrong place for your directory structure?
@LarissaReames-NOAA I don't have an IC to use for CICE, so I just put a random file there for now so that the RT scripts would work and we could get it running. In the cpld_control_c12, I set
Okay, I have a working coupled C12 test. I pushed the test (cpld_control_c12) and the settings changes needed to my branch feature/lowres. There's also a lowres_rt.conf to run the test. I created a new namelist based on the namelist I used in the original ATM tests to make this work.
Description
In order to test changes to the global-workflow system, we have to run the cycled system. Currently this is done with a C48 atmosphere with 127 levels and a 5-degree ocean model configuration. However, this is not fast enough for performing the testing and is limiting the number of pull requests that can be merged in global-workflow. Additionally, the ensemble and deterministic forecasts are both run using C48, meaning we are not replicating the dual-resolution setup that is used in production.
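For a sense of scale (a back-of-envelope sketch only; it ignores time step, vertical levels, and physics cost), the cubed sphere has 6 x N x N columns at resolution CN, so dropping from C48 toward C12 cuts the column count sharply:

```python
def columns(n):
    return 6 * n * n        # cubed sphere: 6 tiles of n x n cells

base = columns(48)
for n in (48, 24, 18, 12):
    print(f"C{n:>2}: {columns(n):>6} columns (C48/C{n} column ratio = {base / columns(n):.0f})")
```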
Note that the configuration does not have to produce anything scientifically sound; it just needs to run reliably in order to test the connections between tasks of the workflow.
Solution
Note that the ocean can already be run at 5 degrees.
The data assimilation group at EMC will work on enabling these configurations with GSI and JEDI.