Simulation Period:

1 August 2016 - 10 September 2016,

with a 10-day spin-up period and a 30-day analysis period

Field Experiment:

NARVAL2 tropical field study


The simulations were run with global atmosphere models at a storm-resolving grid spacing of 5 km or less. Detailed information can be found in the DYAMOND Summer protocol (pdf).

Initial data are provided on the ESiWACE webpage.

The simulations were completed by May 2018.

Participating models and description

Working with the data

You are welcome to use our systems for analysing the data (instead of copying it around). Our post-processing project bb1153 has some compute resources that can be used for analysis scripts. We therefore recommend storing project data under /work/bb1153/<user-id>/ and using the temporary storage in your scratch folder /scratch/*/<user-id> for processing large data sets. For more information about the Levante file systems, their quotas and backups, see The file systems of Levante (docs.dkrz.de).
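The recommended layout above can be sketched with a few lines of Python. This is a minimal illustration, not an official DKRZ tool; the user id is a hypothetical example, and the assumption that the scratch subfolder is the first letter of the user id is a common DKRZ convention that you should verify for your account.

```python
from pathlib import Path

# Hypothetical DKRZ user id -- replace with your own login.
user = "k204210"

# Persistent project data belongs under the post-processing project bb1153.
project_dir = Path("/work/bb1153") / user

# Large intermediate files belong in scratch; the subfolder is assumed
# here to be the first letter of the user id (check your actual path).
scratch_dir = Path("/scratch") / user[0] / user

print(project_dir)   # /work/bb1153/k204210
print(scratch_dir)   # /scratch/k/k204210
```

Building the paths with pathlib keeps scripts portable if the project or scratch roots change.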

For model and run descriptions see Participating models and description above.

Direct file access

DYAMOND Summer data sets can be found in the DYAMOND Data Library at /work/bk1040/DYAMOND/data/summer_data/ on DKRZ's Levante supercomputer. The data sets are read-only, so you will have to use the respective subfolder as the source directory for your post-processing scripts and write any results elsewhere.
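Because the library is read-only, a processing script should only ever read from it and write to your own project or scratch space. A small sketch of resolving a model's source directory (the model folder name "ICON-5km" is a hypothetical placeholder; list the data library to see the real folder names):

```python
from pathlib import Path

# Root of the DYAMOND Summer data library (read-only).
DYAMOND_ROOT = Path("/work/bk1040/DYAMOND/data/summer_data")

def source_dir(model: str) -> Path:
    """Return the read-only source directory for one model's data sets."""
    return DYAMOND_ROOT / model

# Hypothetical model folder name -- check the library for the real ones.
src = source_dir("ICON-5km")
print(src)  # /work/bk1040/DYAMOND/data/summer_data/ICON-5km
```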

DKRZ provides a JupyterHub server for analysing the large data sets stored at DKRZ using Python, R or Julia scripts running directly on Levante.

Interesting pages from our Hints for processing data section might be:

Hints and Help

  • a data file description can be found in datadescription.txt within each model folder of the DYAMOND data library

  • some models provide further information in a README file in their model folder in the DYAMOND data library

  • example processing scripts can be found within the DYAMOND data library at /work/bk1040/DYAMOND/scripts/

  • slides of the 2nd DYAMOND Hackathon, June 2019 (at the bottom of the page)

  • take a look at the wiki page of the MPI-M
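The first two hints above can be automated: a script can collect datadescription.txt and README from a model folder before processing its data. A minimal sketch, demonstrated on a throwaway temporary folder (the real model folders live in the DYAMOND data library under /work/bk1040/DYAMOND/data/summer_data/):

```python
import tempfile
from pathlib import Path

def read_model_docs(model_dir: Path) -> dict:
    """Collect the per-model documentation files, if present."""
    docs = {}
    for name in ("datadescription.txt", "README"):
        f = model_dir / name
        if f.is_file():
            docs[name] = f.read_text()
    return docs

# Demo on a temporary directory so the sketch runs anywhere; point
# model_dir at a real model folder in the data library instead.
with tempfile.TemporaryDirectory() as tmp:
    model = Path(tmp) / "SomeModel"
    model.mkdir()
    (model / "datadescription.txt").write_text("2D fields, 15 min output\n")
    print(read_model_docs(model))
```

Not every model ships a README, so the function simply skips files that are absent rather than failing.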