Aus400


Aus400 was a project to run a regional ACCESS model at 400m resolution over the whole of Australia, as part of NCI's STRESS2020 test of Gadi.

NCI projects:

  • ia89: Aus400 published dataset
  • ly62: STRESS2020 CLEX compute

Scripts and patches used for the model run are available at https://github.com/coecms/highreslam

[Image: HighResLam clouds]

Experiment Details

Two LAM (limited-area model) nest levels are actively run, at 2.2km and 400m resolution. External boundary conditions come from BARRA, the Bureau of Meteorology's regional reanalysis.

The two nest levels are run as individual Rose suites:

  • Mid level domain: u-bm651
  • Inner domain: u-bq574

The period of interest for this experiment is the landfall of TC Debbie, from 20170327T0000Z to 20170329T0000Z. The 2.2 km domain has 24 hours of spinup (from 20170326T0000Z); the 400m domain has 12 hours of spinup (from 20170326T1200Z).

400m Domain

The domain is 13194 x 10554 grid points, with a grid spacing of 0.0036 degrees in each direction, centred at 27.8S 133.26E.

The domain boundaries work out to [109.5108, 157.0056] longitude in degrees_east and [-46.7972, -8.8064] latitude in degrees_north.
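
As a quick check, these boundaries can be reconstructed from the grid size and spacing, assuming the quoted values are the first and last grid-point centres:

    import numpy as np

    nx, ny = 13194, 10554                  # grid points (longitude x latitude)
    dx = 0.0036                            # grid spacing in degrees

    lon = 109.5108 + dx * np.arange(nx)    # degrees_east
    lat = -46.7972 + dx * np.arange(ny)    # degrees_north

    print(lon[-1], lat[-1])  # ~157.0056, ~-8.8064, matching the quoted boundaries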

On Gadi, the run used 11904 CPU cores (a 30 x 32 domain decomposition, with 32 additional MPI processes and 12 OpenMP threads per MPI process). One model hour of simulation took around 2 hours 40 minutes of walltime, and the model used about 42 TB of memory.
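
For reference, the core count follows directly from that decomposition:

    # (30 x 32) compute ranks plus 32 additional MPI processes, 12 threads each
    total_cores = (30 * 32 + 32) * 12
    print(total_cores)  # 11904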

Data Access

Output from Aus400 is published at NCI: http://dx.doi.org/10.25914/5f1e4e3e3de2e

Full details are available in the file '/g/data/ia89/aus400/README.md'

The data is structured as follows:

  • catalogue.csv: A listing of all published netCDF4 files
  • variables.csv: Metadata for each of the published variables
  • grids/: Grid and regridding information for converting from 400m to 2.2km and the BARRA domain
  • u-bq574/: 400m output data, paths like u-bq574/$RES/$STREAM/$VAR/$RES.$STREAM.$VAR.$DATE.nc
  • u-bm651/: 2.2km output data, paths like u-bm651/$RES/$STREAM/$VAR/$RES.$STREAM.$VAR.$DATE.nc
  • u-bs365/: 2.2km ensemble output data, paths like u-bs365/$RES/$START/$STREAM/$VAR/$RES.$STREAM.$VAR.$DATE.nc
  • restarts/: Restart information and boundary conditions

Variables in the output paths are:

  • RES - 'd0198' for 0.0198 degree (2.2km), 'd0036' for 0.0036 degree (400m)
  • START - Experiment start time
  • STREAM - BARRA stream names
    • fx: static variables
    • cldrad: cloud and radar variables on model levels
    • mdl: model level variables
    • slv: single level variables @ 1 hour
    • spec: single level variables @ 10min
  • VAR - BARRA variable names
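
Putting the path template and these components together, a single file can be opened directly with xarray. A minimal sketch (the variable name and date below are hypothetical placeholders; see variables.csv and catalogue.csv for the real values):

    import xarray as xr

    root = "/g/data/ia89/aus400"

    # Path components following the template above; 'some_var' and the date
    # are hypothetical placeholders, not real values from the dataset
    suite, res, stream = "u-bq574", "d0036", "slv"
    var, date = "some_var", "20170327T0000Z"

    path = f"{root}/{suite}/{res}/{stream}/{var}/{res}.{stream}.{var}.{date}.nc"
    ds = xr.open_dataset(path)
    print(ds)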

A Python/Jupyter cookbook for working with the Aus400 data is available at https://github.com/coecms/aus400-cookbook. Contributions of analysis notebooks or functions are welcome.

Accessing from Gadi/VDI

On NCI systems the data is available under /g/data/ia89/aus400. Note that you will need to join the ia89 project to access this data (this is simply to track usage; there are no requirements to join other than having NCI access).
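
For example, the catalogue file can be used to locate files of interest (a minimal sketch; inspect the file or the README for its actual columns):

    import pandas as pd

    # Listing of all published netCDF4 files
    cat = pd.read_csv("/g/data/ia89/aus400/catalogue.csv")
    print(cat.columns)
    print(cat.head())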

Accessing remotely

Data is available remotely via OpenDAP through NCI's THREDDS catalogue: http://dapds00.nci.org.au/thredds/catalogs/ia89/catalog.html
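
An OpenDAP dataset can then be opened directly from its URL with xarray. A minimal sketch, assuming the THREDDS dataset paths mirror the /g/data/ia89 layout (browse the catalogue above for the exact URLs):

    import xarray as xr

    # Hypothetical OpenDAP URL; check the THREDDS catalogue for the real paths
    url = ("http://dapds00.nci.org.au/thredds/dodsC/ia89/aus400/"
           "u-bq574/d0036/slv/some_var/d0036.slv.some_var.20170327T0000Z.nc")

    ds = xr.open_dataset(url)  # values are only fetched over the network when read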

Citing Aus400

BibTeX citation:

@misc{aus400,
    author = {Scott Wales and Chun-Hsu Su and Charmaine Franklin and Christian Jacob and Martin Jucker and Todd Lane and Clare Vincent},
    title = {400m resolution ACCESS limited-area simulation over Australia},
    year = {2020},
    doi = {10.25914/5f1e4e3e3de2e},
    howpublished = {\url{http://dx.doi.org/10.25914/5f1e4e3e3de2e}}
}


Model Modifications

The full set of Rose changes from the Nesting Suite u-bi769 for the 400 m run:

  • Due to the large domain size it was not possible to generate ancillary files for the full 400 m domain with standard tooling. Instead the files were generated in four quadrants, then the quadrants were combined into single files covering the full domain using Mule (see the sketch after this list).
  • Due to memory and walltime limitations the spiral search used in the UM reconfiguration to give data to newly resolved coastal grid points was performed offline, by saving the input values to a file, running the existing algorithm on that file using a larger number of CPUs, then re-running the reconfiguration reading from the processed files instead of performing the spiral search online.
  • There were some errors in the reconfiguration when gathering a full field from the individual MPI ranks, which resulted in artefacts at the MPI domain boundaries for some fields.
  • Model orography was generated from SRTM data processed using ANTS' ancil_orographic_wavedrag.py, rather than the default IDL based script from the Nesting Suite. Orography over PNG was smoothed with an additional 3km Raymond filter before interpolation to the model grid. The orography field was inserted directly into the initial conditions file rather than using the reconfiguration to avoid the boundary errors described above.
  • Maximum field size within the IO server messages was increased to be able to fit the model grid.
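
As an illustration of the quadrant combination described in the first item above, the outline using Mule might look like the following. This is a minimal sketch rather than the actual script: the file names are hypothetical, it assumes all four quadrant files hold the same fields in the same order, and a real combined file would also need its grid-origin headers updated and Mule's validation satisfied.

    import mule
    import numpy as np

    # Load the four quadrant ancillaries (hypothetical file names)
    order = ("sw", "se", "nw", "ne")
    quads = {q: mule.AncilFile.from_file(f"ancil_{q}.anc") for q in order}

    # Reuse the south-west quadrant as the output file, replacing its fields
    out = quads["sw"]

    for f_sw, f_se, f_nw, f_ne in zip(*(quads[q].fields for q in order)):
        # Row 0 is the southernmost row in UM fields, so the southern
        # quadrants form the first block row of the combined array
        full = np.block([[f_sw.get_data(), f_se.get_data()],
                         [f_nw.get_data(), f_ne.get_data()]])
        f_sw.lbrow, f_sw.lbnpt = full.shape
        f_sw.set_data_provider(mule.ArrayDataProvider(full))

    # Update the file-level grid dimensions to the combined size
    out.integer_constants.num_rows, out.integer_constants.num_cols = full.shape
    out.to_file("ancil_full.anc")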