Conversation

@uturuncoglu (Contributor)

Description

This PR brings the CDEPS inline capability to FV3 so that selected fields can be updated from data streams.

Issue(s) addressed

N/A

Testing

How were these changes tested?
What compilers / HPCs was it tested with?
Are the changes covered by regression tests? (If not, why? Do new tests need to be added?)
Have the ufs-weather-model regression tests been run? On what platform?

  • Will the code updates change regression test baseline? If yes, why? Please show the baseline directory below.
  • Please commit the regression test log files in your ufs-weather-model branch

We ran a limited test with the SRW HRRR configuration that updates SST and other sea-ice-related variables over the Great Lakes.

Dependencies

If testing this branch requires non-default branches in other repositories, list them.
Those branches should have matching names (ideally)

Do PRs in upstream repositories need to be merged first?
If so add the "waiting for other repos" label and list the upstream PRs

@uturuncoglu (Contributor, Author)

Note that the implementation is not working after aa14843. The data provided by CDEPS cannot be passed through correctly; the approach used in fv3/atmos_model.F90 probably needs to be fixed.

@DusanJovic-NOAA (Collaborator)

@uturuncoglu Can you check the lower/upper bounds of the datar82d array in setup_inlinedata?

This loop:

          do j = jsc, jec
             do i = isc, iec
                nb = Atm_block%blkno(i,j)
                ix = Atm_block%ixp(i,j)
                im = GFS_control%chunk_begin(nb)+ix-1
                GFS_Coupling%fice_dat(im) = datar82d(i,j)
             end do
          end do

assumes that the indices of datar82d are global. But when the ESMF_Field holding this pointer is created:

https://github.com/NOAA-EMC/fv3atm/blob/3145f5b45e244a0949b7082d8bb3057df269b619/cpl/module_inline.F90#L155

indexflag=ESMF_INDEX_DELOCAL is used. Try to use ESMF_INDEX_GLOBAL.
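
For reference, a minimal sketch of that suggestion at the field-creation call (the field and variable names here are illustrative, not the actual ones in module_inline.F90):

      ! Sketch: create the field with global index bounds so its array
      ! can be addressed with the (isc:iec, jsc:jec) subscripts above.
      field = ESMF_FieldCreate(grid, typekind=ESMF_TYPEKIND_R8, &
                               indexflag=ESMF_INDEX_GLOBAL, &
                               name="fice_dat", rc=rc)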

@uturuncoglu (Contributor, Author)

@DusanJovic-NOAA Let me try. Thanks.

@uturuncoglu (Contributor, Author) commented Jul 11, 2025

@DusanJovic-NOAA JFYI, I am passing the grid component to CDEPS inline with the following call:

call stream_init(fcstGridComp(cpl_grid_id), clock, rc)

and the grid for fcstGridComp is defined as follows in FV3/fv3/module_fcst_grid_comp.F90:

      grid = ESMF_GridCreateCubedSphere(tileSize=tilesize, &
                                        coordSys=ESMF_COORDSYS_SPH_RAD, & 
                                        coordTypeKind=grid_typekind, &
                                        regDecompPTile=decomptile, &
                                        decompflagPTile=decompflagPTile, &
                                        name="fcst_grid", rc=rc)

As far as I know, the default value for the index flag here is ESMF_INDEX_DELOCAL. So, if I create the fields with ESMF_INDEX_GLOBAL, I get the following error:

20250711 155429.969 ERROR            PET158 ESMF_FieldEmpty.F90:19545 ESMF_FieldEmptyCompGB Argument sizes do not match  - - user specified indexflag must be identical with Grid indexflag
20250711 155429.970 ERROR            PET158 ESMF_FieldCreate.F90:3232 ESMF_FieldCreateGBData Argument sizes do not match  - Internal subroutine call returned Error
20250711 155429.970 ERROR            PET158 ESMF_FieldCreate.F90:9567 ESMF_FieldCreateGridData Argument sizes do not match  - Internal subroutine call returned Error

The fields need to be created with ESMF_INDEX_DELOCAL to be consistent with the grid definition.
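
The practical consequence (a sketch, with hypothetical variable names): with ESMF_INDEX_DELOCAL, the pointer retrieved from the field is dimensioned from 1 on each DE, not from the global start indices:

      ! With ESMF_INDEX_DELOCAL, the array returned by ESMF_FieldGet has
      ! bounds 1:(iec-isc+1) and 1:(jec-jsc+1) on each DE, so a loop over
      ! global (isc:iec, jsc:jec) must shift its subscripts accordingly.
      call ESMF_FieldGet(field, farrayPtr=datar82d, rc=rc)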

@uturuncoglu (Contributor, Author)

The datar82d indices are local in my case. So, I updated datar82d(i,j) to datar82d(i-isc+1,j-jsc+1) so that the loop subscripts are local as well. Then I ran my configuration again, but it did not help much.
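
For clarity, the loop from above then becomes (same code, with the local-index shift applied):

          do j = jsc, jec
             do i = isc, iec
                nb = Atm_block%blkno(i,j)
                ix = Atm_block%ixp(i,j)
                im = GFS_control%chunk_begin(nb)+ix-1
                ! shift global (i,j) into the 1-based DELOCAL array bounds
                GFS_Coupling%fice_dat(im) = datar82d(i-isc+1,j-jsc+1)
             end do
          end do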

This is the mask that CDEPS provides, as stored in datar82d:

[Screenshot: CDEPS-provided mask stored in datar82d]

and the following is from the FV3 output file phyf000.nc, which is on the CCPP side (using CCPP debugging, I added the field to the FV3 history file):

[Screenshot: the corresponding field in phyf000.nc]

It seems that the block positions are fine, but maybe there is an issue with the im = GFS_control%chunk_begin(nb)+ix-1 calculation. Any ideas?

@uturuncoglu (Contributor, Author)

@DusanJovic-NOAA I also think that the data on the CCPP side looks rotated or reversed in one direction. Not sure.

@DeniseWorthen (Collaborator)

@uturuncoglu I'm going to guess that you've tried this in debug mode?

@uturuncoglu (Contributor, Author)

@grantfirl Thanks for the extra information. That is really helpful. The confusion was on my end, since it was working with horizontal_loop_extent before and only failed after syncing with the model. Anyway, it is all solved now.

@grantfirl (Collaborator)

@uturuncoglu You need to merge in the latest develop please.

@uturuncoglu (Contributor, Author)

@grantfirl let me sync

@gspetro-NOAA

Could we get reviews on this sub-PR so that we can process WM parent PR 2807?

@gspetro-NOAA commented Jan 15, 2026

@NickSzapiro-NOAA Could you review this PR (or tag someone who would be better if you're not the right person)?

@BrianCurtis-NOAA (Collaborator)

ccpp-physics is now at ufs-community/ccpp-physics@55886ae. Please update the hash and revert .gitmodules. Once this is completed, the PR can be merged.

@grantfirl (Collaborator)

@uturuncoglu The ccpp-physics PR has been merged. Please revert .gitmodules and update the ccpp-physics hash to ufs-community/ccpp-physics@55886ae

@uturuncoglu (Contributor, Author)

@BrianCurtis-NOAA @grantfirl Okay. I updated UFSATM to point to the official ccpp-physics repository and also updated the hash.

@BrianCurtis-NOAA (Collaborator)

Looks good, please merge when ready.

@DusanJovic-NOAA DusanJovic-NOAA merged commit 35e9c99 into NOAA-EMC:develop Jan 16, 2026
8 checks passed