Scripts and patches used for the model run are available at https://github.com/coecms/highreslam
Two nested LAM levels are actively run, at 2.2 km and 400 m resolution. External boundary conditions come from BARRA, the BoM regional reanalysis.
The two nest levels are run as separate Rose suites:
- Mid-level domain: u-bm651
- Inner domain: u-bq574
The period of interest for this experiment is the landfall of TC Debbie, from 20170327T0000Z to 20170329T0000Z. The 2.2 km domain has 24 hours of spinup (from 20170326T0000Z); the 400 m domain has 12 hours of spinup (from 20170326T1200Z).
The domain is 13194 x 10554 grid points, with a grid spacing of 0.0036 degrees, centred at 27.8S, 133.26E.
The domain boundaries work out to [109.5108, 157.0056] degrees_east in longitude and [-46.7972, -8.8064] degrees_north in latitude.
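These boundary values are consistent with the grid dimensions, spacing and centre quoted above, taking the bounds as the outermost grid-point centres (so the span is (n - 1) times the grid spacing). A quick sketch of the arithmetic, with the small residual coming from the centre coordinates being rounded in the text:

```python
# Grid parameters quoted above (centre coordinates are rounded).
nx, ny = 13194, 10554          # grid points (longitude x latitude)
dx = 0.0036                    # grid spacing in degrees
lon_c, lat_c = 133.26, -27.8   # approximate domain centre

# Span between the outermost grid-point centres is (n - 1) * dx.
half_lon = (nx - 1) * dx / 2
half_lat = (ny - 1) * dx / 2

lon_bounds = (lon_c - half_lon, lon_c + half_lon)
lat_bounds = (lat_c - half_lat, lat_c + half_lat)
print(lon_bounds, lat_bounds)  # within ~0.01 degrees of the quoted bounds
```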
On Gadi, the run used 11904 CPU cores (a decomposition of 30 x 32, with 32 additional MPI processes, and 12 OpenMP threads per MPI process). One model hour of simulation took around 2 hours 40 minutes of walltime, and the model used about 42 TB of memory.
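The core count follows directly from the decomposition figures; a quick sanity check (the guess that the extra MPI processes serve the IO servers is mine, based on the IO server note at the end of this document):

```python
# MPI decomposition and threading reported for the Gadi run.
atmos_procs = 30 * 32   # 30 x 32 atmosphere decomposition
extra_procs = 32        # additional MPI processes (possibly IO servers)
threads = 12            # OpenMP threads per MPI process

total_cores = (atmos_procs + extra_procs) * threads
print(total_cores)  # 11904
```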
Temporary processed outputs for testing purposes are available at /g/data/w35/saw562/HighResLAM/output
- Pre-visualisation movie of cloud cover during the 2.2 km domain spinup
- Poster image: full resolution (1 pixel = 1 grid point), 400 m surface types and 2.2 km clouds at 3 pm AEST, 27 March 2017
Input data and raw outputs are available at /g/data/ua8/HighResLAM
- Due to the large domain size it was not possible to generate ancillary files for the full 400 m domain with the standard tooling. Instead the files were generated in four quadrants, which were then combined into single files covering the full domain using Mule.
- Due to memory and walltime limitations, the spiral search used in the UM reconfiguration to fill newly resolved coastal grid points was performed offline: the input values were saved to a file, the existing algorithm was run on that file with a larger number of CPUs, and the reconfiguration was then re-run reading from the processed files instead of performing the spiral search online.
- There were some errors in the reconfiguration when gathering a full field from the individual MPI ranks, which resulted in artefacts at the MPI domain boundaries for some fields.
- Model orography was generated from SRTM data processed with ANTS' ancil_orographic_wavedrag.py, rather than the default IDL-based script from the Nesting Suite. Orography over PNG was smoothed with an additional 3 km Raymond filter before interpolation to the model grid. The orography field was inserted directly into the initial conditions file, rather than via the reconfiguration, to avoid the boundary errors described above.
- The maximum field size in the IO server messages was increased to fit the model grid.
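The quadrant merge in the ancillary note above is, at its core, pasting four sub-arrays into one full-domain array. A minimal numpy sketch of that step (the real workflow read and wrote UM ancillary files with Mule, which is not reproduced here; array names are illustrative):

```python
import numpy as np

def combine_quadrants(sw, se, nw, ne):
    """Stitch four quadrant arrays (rows = latitude, cols = longitude)
    into a single full-domain array."""
    south = np.concatenate([sw, se], axis=1)  # join along longitude
    north = np.concatenate([nw, ne], axis=1)
    return np.concatenate([south, north], axis=0)  # join along latitude

# Toy example: a 4 x 6 field split into 2 x 3 quadrants round-trips exactly.
full = np.arange(24, dtype=float).reshape(4, 6)
sw, se = full[:2, :3], full[:2, 3:]
nw, ne = full[2:, :3], full[2:, 3:]
assert np.array_equal(combine_quadrants(sw, se, nw, ne), full)
```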
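The offline spiral search described above can be illustrated with a toy version: for each point that needs data, search outward in expanding rings until a valid source point is found. This is a conceptual sketch only, not the UM algorithm (which also accounts for land/sea masks and great-circle distances):

```python
import numpy as np

def spiral_fill(data, valid):
    """Fill invalid points with the value of the nearest valid point,
    searching outward in expanding square rings (a simplified stand-in
    for the UM reconfiguration's spiral search)."""
    ny, nx = data.shape
    out = data.copy()
    for j, i in zip(*np.where(~valid)):
        for r in range(1, max(ny, nx)):
            j0, j1 = max(j - r, 0), min(j + r, ny - 1)
            i0, i1 = max(i - r, 0), min(i + r, nx - 1)
            # Valid points on the ring at Chebyshev distance r.
            ring = [(jj, ii)
                    for jj in range(j0, j1 + 1)
                    for ii in range(i0, i1 + 1)
                    if max(abs(jj - j), abs(ii - i)) == r and valid[jj, ii]]
            if ring:
                # Take the closest candidate by Euclidean distance.
                jj, ii = min(ring, key=lambda p: (p[0] - j) ** 2 + (p[1] - i) ** 2)
                out[j, i] = data[jj, ii]
                break
    return out

# Toy field: the zero marks a point with no data (e.g. a newly resolved
# coastal point); it takes the value of a nearest valid neighbour.
field = np.array([[1., 2., 3.],
                  [4., 0., 6.],
                  [7., 8., 9.]])
filled = spiral_fill(field, field != 0)
```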
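For reference, the Raymond filter mentioned in the orography note is the implicit sixth-order low-pass filter commonly attributed to Raymond (1988): the filtered field solves a linear system (I - eps * d6) u_f = u, where d6 is the sixth-difference operator. A one-dimensional sketch with periodic boundaries; the operational ANTS processing, two-dimensional grids and real boundary handling are not reproduced here:

```python
import numpy as np

def raymond_filter_1d(u, eps):
    """Implicit sixth-order low-pass filter on a periodic 1-D field:
    solve (I - eps * d6) u_f = u, where d6 is the sixth-difference
    operator. Larger eps damps shorter wavelengths more strongly."""
    n = len(u)
    stencil = [1., -6., 15., -20., 15., -6., 1.]  # sixth difference
    A = np.eye(n)
    for k, c in zip(range(-3, 4), stencil):
        A -= eps * c * np.eye(n, k=k)
        if k != 0:  # wrap the band around for periodic boundaries
            A -= eps * c * np.eye(n, k=k - (n if k > 0 else -n))
    return np.linalg.solve(A, u)

# The shortest resolvable wave is damped by 1 / (1 + 64 * eps),
# while a constant field passes through unchanged.
n = 32
u = (-1.0) ** np.arange(n)
uf = raymond_filter_1d(u, eps=1.0)
```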