
Issues with using FMS initialization in soca Geometry #1186

@shlyaeva

Description


While debugging parallel recentering I discovered a couple of issues with using FMS initialization in the soca Geometry. Without knowing much about FMS, I suspect they all happen because FMS expects to be initialized exactly once, while we initialize it for every geometry we create. Alternatively, it may be possible to initialize FMS in parallel on different communicators, but we are not doing it correctly.
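If the single-initialization hypothesis is right, one direction would be to guard the initialization so it runs at most once per executable. Below is a minimal sketch of such a guard, assuming fms_init from FMS's fms_mod (the module and subroutine names are hypothetical, not soca code); note this alone would not help with the different-sized-communicator case in the second issue below.

```fortran
! Minimal sketch, assuming FMS's fms_mod; soca_fms_guard and
! soca_fms_init_once are hypothetical names, not part of soca.
module soca_fms_guard
  use fms_mod, only: fms_init
  implicit none
  private
  public :: soca_fms_init_once
  logical :: fms_initialized = .false.  ! module-level "already done" flag
contains
  !> Initialize FMS at most once per executable, on the given communicator.
  subroutine soca_fms_init_once(localcomm)
    integer, intent(in) :: localcomm
    if (.not. fms_initialized) then
      call fms_init(localcomm)  ! fms_init accepts an optional MPI communicator
      fms_initialized = .true.
    end if
  end subroutine soca_fms_init_once
end module soca_fms_guard
```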

Issues:

  • During initialization, input.nml is copied (this happens in soca::FmsInput, so it is not strictly an FMS issue) and warnfile.000000.out is created (on develop this happens in the mpp_init call inside the Fortran soca geometry). When several geometries are created in parallel, e.g. when ensemble members are distributed as in Parallel ensemble recentering and postprocessing for GFS #1183, this can trigger a race condition on creating/copying those files. Is it possible to not create the warnfile, or to make sure the file names are different when running on different communicators?
  • FMS/MPP initializes some communicator-related variables. It is currently not possible to create two soca::Geometries on different-sized MPI communicators in the same application. For example, one cannot currently write an application that transposes an ensemble of states from all-parallel to all-sequential, which may be useful in the future (see the sketch after this list). Is it possible to initialize FMS/MPP in parallel on different communicators, possibly of different sizes, and continue using both setups in the same application?
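To make the second issue concrete, here is a minimal sketch (a hypothetical standalone program, assuming MPI and FMS's fms_mod; soca's actual Fortran is different) of the pattern a transpose application would need: one FMS setup on per-member sub-communicators and a second setup on the full communicator.

```fortran
! Minimal sketch (hypothetical program) of the currently unsupported
! pattern: two FMS initializations on different-sized communicators
! within one executable.
program two_comm_demo
  use mpi
  use fms_mod, only: fms_init
  implicit none
  integer :: ierr, rank, member, member_comm
  integer, parameter :: nmembers = 4   ! hypothetical ensemble size

  call mpi_init(ierr)
  call mpi_comm_rank(mpi_comm_world, rank, ierr)

  ! Split the world into nmembers equal groups, one per ensemble member.
  member = mod(rank, nmembers)
  call mpi_comm_split(mpi_comm_world, member, rank, member_comm, ierr)

  ! First soca Geometry: all-parallel, each member on its own
  ! sub-communicator of size world_size/nmembers.
  call fms_init(member_comm)
  ! ... member-local work (e.g. read this member's state) ...

  ! Second soca Geometry: all-sequential, on the full communicator.
  ! This is where a second, different-sized FMS/MPP setup would be
  ! needed -- currently not possible, presumably because MPP keeps
  ! communicator-related state in module variables, as suspected above.
  call fms_init(mpi_comm_world)
  ! ... global work (e.g. the all-parallel -> all-sequential transpose) ...

  call mpi_finalize(ierr)
end program two_comm_demo
```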
