PDAF (Parallel Data Assimilation Framework)

Copyright 2004-2020, Lars Nerger, Alfred Wegener Institute,
Helmholtz Center for Polar and Marine Research, Bremerhaven, Germany.
For license information, please see the file LICENSE.txt.

PDAF was written by Lars Nerger ([email protected])
For full documentation and tutorial, see: http://pdaf.awi.de


Introduction
---------------------------------------------------------

PDAF is a framework for data assimilation. PDAF can be used to assess data
assimilation methods with small models, to perform real data assimilation
with high-dimensional models, and to teach ensemble data assimilation.

PDAF provides
- a parallel infrastructure, using MPI and OpenMP, to implement a parallel
  data assimilation system based on existing numerical models (typically of
  components of the Earth system).
- a selection of sequential data assimilation algorithms based on the
  Kalman filter or nonlinear filters.
- functions for ensemble diagnostics
- functionality to generate synthetic observations for data assimilation
  studies (e.g. OSSEs)

The PDAF release also provides
- toy models fully implemented with PDAF for the study of data assimilation
  methods.
- model bindings for using PDAF with different models

The filter algorithms in PDAF are:

___ filters with global analysis step ___

- EnKF (The classical perturbed-observations Ensemble Kalman filter)
  [G. Evensen, J. Geophys. Res. 99 C5 (1994) 10143-10162,
   G. Burgers et al., Mon. Wea. Rev. 126 (1998) 1719-1724]
- ESTKF (Error Subspace Transform Kalman filter)
  [L. Nerger et al., Mon. Wea. Rev. 140 (2012) 2335-2345, doi:10.1175/MWR-D-11-00102.1]
- ETKF (Ensemble Transform Kalman filter)
  [C. H. Bishop et al., Mon. Wea. Rev. 129 (2001) 420-436]
  The implementation in PDAF follows that described for the LETKF, but as a
  global filter.
- SEIK (Singular "Evolutive" Interpolated Kalman) filter
  This is the full ensemble variant of the SEIK
  (Singular "Interpolated" Extended Kalman) filter.
  [SEIK: D.-T. Pham et al., C. R. Acad. Sci., Ser. III, 326 (1998) 255-260;
   for the SEIK variant in PDAF see L. Nerger et al., Tellus 57A (2005) 715-735]
- SEEK (Singular "Evolutive" Extended Kalman) filter
  [D.-T. Pham et al., J. Mar. Syst. 16 (1998) 323-340]
- NETF (Nonlinear Ensemble Transform Filter)
  [J. Toedter, B. Ahrens, Mon. Wea. Rev. 143 (2015) 1346]
- PF (Particle filter with importance resampling)
  [see S. Vetra-Carvalho et al., Tellus A 70 (2018) 1445364]

___ filters with localized analysis step ___

- LNETF (Nonlinear Ensemble Transform Filter with observation localization)
  [J. Toedter, B. Ahrens, Mon. Wea. Rev. 143 (2015) 1346]
- LESTKF (Local Error Subspace Transform Kalman filter)
  [L. Nerger et al., Mon. Wea. Rev. 140 (2012) 2335-2345, doi:10.1175/MWR-D-11-00102.1]
- LETKF (Local Ensemble Transform Kalman filter)
  [B. R. Hunt et al., Physica D 230 (2007) 112-126]
- LSEIK (Local Singular "Evolutive" Interpolated Kalman) filter
  [L. Nerger et al., Oce. Dyn. 56 (2006) 634-649]
- LEnKF (The classical perturbed-observations Ensemble Kalman filter with
  localization)
  [G. Evensen, J. Geophys. Res. 99 C5 (1994) 10143-10162,
   G. Burgers et al., Mon. Wea. Rev. 126 (1998) 1719-1724]

All filter algorithms are fully parallelized with MPI and optimized. The
local filters (LSEIK, LETKF, LESTKF, LNETF) are in addition parallelized
using OpenMP.

Smoother extensions are included for the filters ESTKF/LESTKF, ETKF/LETKF,
EnKF, and NETF/LNETF.

PDAF is written in Fortran (mainly Fortran 90 with some features from
Fortran 2003). Compilation and execution have been tested on systems
ranging from notebook computers to supercomputers, e.g.:
- Linux
- MacOS X
- Cray CLE
- NEC Super-UX
- Microsoft Windows 10 with Cygwin
- AIX (until 2015)
- IRIX (long time ago)
- Solaris (until ~2013)


Tutorial
-----------------------------------------------------------------

A good starting point for using PDAF is to study the tutorial
implementations. The directory tutorial/ contains files for a tutorial
demonstrating the implementation of an offline analysis step with a simple
2-dimensional example, as well as examples for an online implementation
with a serial (non-parallelized) and a parallel model.

The web site of PDAF (http://pdaf.awi.de) holds the corresponding tutorial
slide sets in PDF format. The slides explain the implementation steps and
how to compile and run the examples. The directory also contains a README
file with instructions.


Models
--------------------------------------------------------------------

The directory models/ contains toy models that are fully implemented with
PDAF. These models can be used to assess the behavior of different
assimilation algorithms.

- lorenz96
  This directory contains the Lorenz-96 model, a widely used model to
  assess data assimilation methods. Provided is a full implementation of
  the data assimilation with various filters and options. This model can be
  configured to have a sufficiently large state dimension to test low-rank
  filter algorithms.
- lorenz63
  This directory contains the Lorenz-63 model, a classical 3-variable model
  with chaotic dynamics. Provided is a full implementation of the data
  assimilation with various filters and options. The small state dimension
  and nonlinear dynamics make it a suitable test case for the standard
  particle filter (PF).

Instructions on how to use these models are provided on the PDAF web site.


Installation of the PDAF library
------------------------------------------

The PDAF library is built automatically when compiling a tutorial case or
one of the models. However, the library can also be built separately. To
build the PDAF library you need a Fortran 90 compiler and 'make'; an
example session is sketched after the steps below.

1. Choose a suitable include file for the make process and/or edit one. The
   directory make.arch/ provides several include files, for compilation
   with and without MPI.
   Note: PDAF is generally intended for parallel computing using MPI, but
   it can also be compiled for serial computing. For this case, a
   simplified MPI header file is included and should be in the include
   path. In addition, a dummy implementation of MPI, which behaves like MPI
   in the single-process case, is provided in the directory nullmpi/. For
   the serial case, this file should also be compiled and linked when PDAF
   is linked to a program.
2. Set the environment variable $PDAF_ARCH to the name of the include file
   (without the ending .h). Alternatively, you can specify PDAF_ARCH on the
   command line when running 'make' in step 3.
3. cd into the directory src/ and type 'make' at the prompt. This will
   compile the sources and generate a library file in the directory lib/.
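As a minimal sketch, the three steps above could look like the following
shell session. The include-file name linux_gfortran is only a placeholder;
use the name of a file that actually exists in make.arch/ for your compiler
and MPI setup.

   # step 1: pick an include file from make.arch/ (placeholder name below)
   # step 2: set PDAF_ARCH to its name without the .h ending
   export PDAF_ARCH=linux_gfortran
   # step 3: compile the library
   cd src
   make            # builds the PDAF library into ../lib/
   ls ../lib       # the library file should now be present

Instead of exporting the variable, PDAF_ARCH can also be passed directly on
the command line, e.g. 'make PDAF_ARCH=linux_gfortran'.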
Test suite
-----------------------------------------------------------

The directory testsuite/ contains another set of example implementations.
These are more for 'internal use'; we use them to validate PDAF.

- dummymodel_1D
  This is the example implementation in which PDAF is fully connected to a
  model. The model is trivial: at each time step, simply the time step size
  is added to the state vector. In this example all available filters are
  implemented.
- offline_1D
  This example shows the usage of PDAF as an offline tool. In the offline
  configuration one computes the ensemble integrations manually and
  supplies this information to PDAF through files. This example does not
  use files, but generates dummy information in the code.

1. To build the examples, follow steps 1 and 2 from above. Subsequently, cd
   to testsuite/src.
2. Executing 'make' will show you the possible options for building. The
   executables will be available in the directory testsuite/bin/ after
   building.

You don't need to build the PDAF library manually as described before; this
is also part of the compile process for the example cases of the test
suite.

The test cases can also be run as pure forward models. For this, one has to
remove the preprocessor definition 'USE_PDAF' from the include file for the
make process.


Verifying your installation
-----------------------------------------------

The tutorial implementations can be verified as follows: In the main
tutorial directory you can run the script runtests.sh. This script compiles
and runs all tutorial implementations. Afterwards, the outputs at the final
time step are checked against reference outputs from the directory
verification. You can also compare output files like
out.online_2D_parallelmodel with the reference files.
(Note: The script runtests.sh uses the generic compile definitions for
Linux with the gfortran compiler. For other systems, you might need to
change the settings in the make definitions files.)

The testsuite also provides functionality for verification: Using 'make',
one can run test cases whose outputs are compared to reference outputs
provided in the sub-directories of the directory testsuite/tests_dummy1D
for different computers and compilers. In particular, the online case
dummymodel_1D and the offline test offline_1D can be run. Scripts for
serial (non-parallel) execution as well as example scripts for running
parallel test jobs on computers with SLURM or PBS batch systems are
provided.

An installation of PDAF can be verified using the test suite as follows
(an example session is sketched after the steps):
1. Prepare the include file in make.arch.
2. cd to testsuite/src.
3. Build and execute the online experiments: 'make pdaf_dummy_online' and
   'make test_pdaf_online > out.test_pdaf_online'.
4. Build and execute the offline experiments: 'make pdaf_dummy_offline' and
   'make test_pdaf_offline > out.test_pdaf_offline'.
5. Check the files out.test_pdaf_online and out.test_pdaf_offline. At the
   end of each file you see a list of checks, where the outputs are
   compared with reference outputs produced with gfortran on MacOS. You can
   also diff these files against the corresponding files in one of the
   example directories in ../tests_dummy1D, where reference output files,
   like output_lestkf0.dat, are also stored.
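As a rough sketch, steps 2-5 above could be typed as follows. This assumes
$PDAF_ARCH is set as in the installation section; the exact target names
should be cross-checked against the list that a plain 'make' prints in
testsuite/src, and <platform directory> stands for one of the existing
sub-directories of tests_dummy1D.

   cd testsuite/src
   make pdaf_dummy_online                            # build the online dummy-model experiment
   make test_pdaf_online > out.test_pdaf_online      # run it and capture the output
   make pdaf_dummy_offline                           # build the offline experiment
   make test_pdaf_offline > out.test_pdaf_offline    # run it and capture the output
   tail out.test_pdaf_online out.test_pdaf_offline   # the comparison checks appear at the end
   # optionally compare against the reference files for your platform:
   diff out.test_pdaf_online ../tests_dummy1D/<platform directory>/out.test_pdaf_online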
Contact Information
-------------------------------------------------------

Please send comments, suggestions, or bug reports to [email protected]