DYAMOND Winter#

Simulation Period:

20 January 2020 - 1 March 2020

with a 10-day spin-up period and a 30-day analysis period

Field Experiment:

Aligned with the EUREC4A field study.

Experiment:

Global atmosphere-only and coupled atmosphere-ocean models with a storm-resolving grid spacing of 5 km or less. Detailed information can be found in the DYAMOND Winter protocol (pdf).

Initial data is provided on the ESiWACE webpage.

The simulations are still being completed.

Participating models and description#

We are in the process of standardizing the contributions we have received so far. THIS MEANS THAT FILE NAMES CAN CHANGE AT ANY TIME WITHOUT PRIOR WARNING. Please have a look at the overview of the processed variables for variable names and data availability.

Grid information

For several models, the grid information is not included in the output. Instead, it can be found in a separate grid.nc file, located in the fx/ frequency directory of the data directory tree, e.g. /work/bk1040/DYAMOND/data/winter_data/DYAMOND_WINTER/NOAA/SHiELD-3km/DW-ATM/atmos/fx/gn/grid.nc for the SHiELD data.

To associate the data with the grid information, cdo needs the operator -setgrid,GRIDFILENAME, e.g.

cdo -sellonlatbox,0,20,40,60 -setgrid,GRIDFILENAME INFILE OUTFILE
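
If you prefer to work from Python, the same operator chain can be run through the python-cdo bindings. The following is only a minimal sketch, assuming the cdo binary and the python-cdo package are available in your environment; INFILE is a placeholder for one of the SHiELD data files.

# Minimal sketch: attach the grid file before further processing via python-cdo.
# Assumes the cdo binary and the python-cdo package are available in your
# environment; INFILE is a placeholder for one of the SHiELD data files.
from cdo import Cdo

cdo = Cdo()

gridfile = ("/work/bk1040/DYAMOND/data/winter_data/DYAMOND_WINTER/"
            "NOAA/SHiELD-3km/DW-ATM/atmos/fx/gn/grid.nc")

cdo.sellonlatbox(
    "0,20,40,60",                         # lon1,lon2,lat1,lat2
    input=f"-setgrid,{gridfile} INFILE",  # attach the grid, then subset
    output="subset.nc",
)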

Please note that, because of disk space constraints, we will soon start offloading data to our tape archive. We provide scripts for requesting data to be retrieved back into the library as needed. Please find more information about this at The DYAMOND Data Library.

Working with the data#

You are warmly invited to use our systems for analyzing the data (instead of copying them around). Our post-processing project bb1153 has some compute resources that can be used for analysis scripts. We therefore recommend storing project data under /work/bb1153/<user-id>/ and using the temporary storage in your scratch folder /scratch/*/<user-id> for processing large data sets. For more information about the Levante file systems, their quotas and backups, see The file systems of Levante (docs.dkrz.de).
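
As a small illustration of this layout, the sketch below derives the recommended project and scratch locations for the current user. It assumes the scratch directory follows the common Levante pattern /scratch/<first letter of user id>/<user-id>; please verify this for your account.

# Sketch: build the recommended output locations for the current user.
# Assumption: scratch follows /scratch/<first letter of user id>/<user id>;
# adjust if your account differs.
import getpass
from pathlib import Path

user = getpass.getuser()

project_dir = Path("/work/bb1153") / user        # persistent project data
scratch_dir = Path("/scratch") / user[0] / user  # temporary storage for large data

for directory in (project_dir, scratch_dir):
    print(directory)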

For model and run descriptions see Participating models and description above.

Using DKRZ's JupyterHub and intake-esm#

DKRZ provides a JupyterHub server for analyzing the large data sets stored at DKRZ with Python, R, or Julia scripts running directly on Levante.

All post-processed DYAMOND Winter data sets are included in the intake-esm catalog of the DYAMOND and nextGEMS projects: /work/ka1081/Catalogs/dyamond-nextgems.json. Intake-esm is a Python library that allows you to easily access data from a variety of simulations in a consistent manner. Take a look at our various Python scripts!
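
The following is a minimal sketch of how the catalog can be queried, assuming intake and intake-esm are installed in your (Jupyter) environment. The facet names used in the search call are assumptions for illustration; inspect cat.df.columns to see which facets this catalog actually provides.

# Minimal intake-esm sketch. Assumes intake and intake-esm are installed
# (e.g. in a DKRZ Jupyter kernel). The facet names in the search call are
# assumptions; check cat.df.columns for the real column names.
import intake

cat = intake.open_esm_datastore("/work/ka1081/Catalogs/dyamond-nextgems.json")
print(cat.df.columns.tolist())          # which facets does the catalog expose?

# Hypothetical query: restrict the catalog to one experiment and variable.
subset = cat.search(experiment_id="DW-ATM", variable_id="tas")

# Load the matching entries as a dictionary of xarray datasets.
datasets = subset.to_dataset_dict()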

Direct file access#

All standardized data sets can be found in The DYAMOND Data Library at /work/bk1040/DYAMOND/data/winter_data/, stored on DKRZ's Levante supercomputer. The data sets are read-only, so you will have to define the respective subfolder as the source directory for your post-processing scripts.
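
As an illustration of working directly on the read-only tree, the sketch below opens the SHiELD grid file mentioned in the grid-information section with xarray and keeps all output in your own directories; xarray is assumed to be available, and the output path is a placeholder.

# Sketch: read directly from the read-only data library and write results
# to your own space. Assumes xarray is available; the output path is a
# placeholder you should adapt.
import xarray as xr

DATA_ROOT = "/work/bk1040/DYAMOND/data/winter_data/DYAMOND_WINTER"

# Example input: the SHiELD grid file referenced above.
grid = xr.open_dataset(f"{DATA_ROOT}/NOAA/SHiELD-3km/DW-ATM/atmos/fx/gn/grid.nc")
print(grid)

# Write anything you derive to your own project or scratch directory,
# e.g. grid.to_netcdf("/scratch/<first letter>/<user-id>/shield_grid.nc")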

Browse the data by using our intake-esm catalog#

All post-processed DYAMOND Winter data sets are included in the intake-esm catalog of the DYAMOND and nextGEMS projects: /work/ka1081/Catalogs/dyamond-nextgems.json.
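
For a quick overview of what the catalog contains, its underlying pandas DataFrame is often the easiest entry point. This is a sketch under the same assumption that intake and intake-esm are installed; the available columns depend on the catalog itself.

# Sketch: browse the catalog contents via its pandas DataFrame.
# Assumes intake and intake-esm are installed; column names depend on the catalog.
import intake

cat = intake.open_esm_datastore("/work/ka1081/Catalogs/dyamond-nextgems.json")

print(cat)              # summary: number of assets and facets
print(cat.df.head())    # first few catalog entries
print(cat.unique())     # unique values per facet, useful for building queries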

Browse the data by using Freva and gems.dkrz.de#

Important

This service is currently under construction and therefore not available.

For an overview of the processed files, please have a look at the GEMS Freva web frontend (use your DKRZ user ID or the “guest” button to log in). To browse the files in the shell, you can use the freva command-line interface (note that this is currently in an alpha stage, as we are still moving things around in the background):

export MODULEPATH=$MODULEPATH:/work/ka1081/gems.modules/
module load gems/alpha

# to get an overview of the data available
freva --databrowser project=dyamond --all-facets

# use tab-completion
freva --databrowser project=dyamond experiment= TAB TAB

# all surface air temperature files for ifs-4km with 1hr time step
# (see possible options from --all-facets above)
freva --databrowser project=dyamond experiment=dw-cpl  model=ifs-4km variable=tas time_frequency=1hr

# call cdo on multiple files
cdo -sinfov  $(freva --databrowser project=dyamond experiment=dw-atm  model=arpege-nh-2km variable=rst)