1. ACCESS-KPP Coupled Model
2. Building
3. Running
4. How it works
5. Performance
6. Reproducibility
7. Parallel KPP
8. Resources
9. Moving From Vayu
ACCESS-KPP Coupled Model
The ACCESS-KPP model couples an ACCESS atmospheric model with a KPP mixed-layer ocean model. The model was developed by Christine Chung at the Bureau of Meteorology, based on work by Nicholas Klingaman at Reading, and was ported to NCI by the CMS team.
The coupled model is currently available as the UMUI job sabjh.
The UM job is built as normal using the UMUI. The code for the coupled model is in the branch fcm:um_dev/saw562/christine-kpp. The coupled model uses the Oasis library installed to /g/data/access/apps/oasis3/svn; this is loaded automatically by the hand edit ~access/umui_jobs/hand_edits/kpp/oasis3.sh and the override ~access/umui_jobs/overrides/coupled/oasis3-module.ovr.
Compiling KPP requires the NetCDF and Oasis3 modules to be loaded, as well as version 1.6.5 of OpenMPI. To do this, run

module use ~access/modules
module swap openmpi/1.6.5
module load netcdf oasis3

before building it.
The KPP source is available in the UM repository under the KPP branch. To obtain the source run
svn co https://access-svn.nci.org.au/svn/um/branches/dev/saw562/christine-kpp/kpp
The file parameter.inc may need to be modified to fit the model resolution; see the variables NX, NY, NX_GLOBE and NY_GLOBE after #ifdef OASIS3. Once this has been set up the model can be built.
If the KPP model has been built in a different location, the environment variable $KPPBIN in the UMUI's Input/Output Control->Time Convention and Environment should be changed to point to the new location; this is what the run scripts use to find the ocean model.
ACCESS1.0-KPP uses an updated version of KPP coupled to the ACCESS 1.0 AMIP configuration.
ACCESS1.0-KPP is built and run using job vatae.
The KPP source is available in the UM repository. To obtain the source run
svn co https://access-svn.nci.org.au/svn/um/branches/dev/saw562/kpp-N48
To build the model run
module use ~access/modules
module load openmpi/1.8.2
module load oasis3-mct-longrun/testing
make oasis3_coupled
KPP will now read the run dates from the UM namelist files, so this does not require manual configuration. You will still need to set the KPP domain.
The paths to necessary KPP variables are set in Input/Output Control->Time Convention and Environment Variables as the environment variables KPPANCIL, KPPSCRIPTS, KPPCONFIG, KPPBIN.
- KPPBIN: KPP executable
- KPPCONFIG: KPP configuration namelist (3D_ocn.nml)
- KPPSCRIPTS: Path to run scripts to set up the KPP job
- KPPANCIL: Path to KPP ancillary files
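The same variables can be checked or set from the shell. The sketch below is illustrative only: the variables are normally set in the UMUI panel, and all paths here are placeholders, not the real project locations.

```shell
# Illustrative only -- the UMUI sets these for the run scripts.
# All paths below are placeholders.
export KPPBIN=$HOME/kpp/KPP_ocean        # KPP executable
export KPPCONFIG=$HOME/kpp/3D_ocn.nml    # KPP configuration namelist
export KPPSCRIPTS=$HOME/kpp/scripts      # run scripts that set up the KPP job
export KPPANCIL=$HOME/kpp/ancil          # KPP ancillary files
```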
You may also need to edit the Oasis namcouple config file, which is set in Sub-Model Configurations->OASIS Coupling Switches.
The job can be processed and submitted as a run job as normal; starting up Oasis is handled by the run scripts.
Changing Run Length
Changing the run length requires modifying the settings both in the UMUI and in the KPP configuration file 3D_ocn.nml. The default location of this is vayu:/data/projects/access/ancil/kpp/oceanancil/3D_ocn.nml; make a copy of it in your own directory and change the $KPPCONFIG environment variable to point to it. You will need to alter the 'finalt' variable in this file, which is in units of days.
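As a sketch, the edit to your copy of 3D_ocn.nml looks like the fragment below. The namelist group name shown is illustrative; check your copy for the actual group containing 'finalt'.

```
! Fragment of a copied 3D_ocn.nml -- group name and value are examples only
&NAME_TIMOCN
  finalt = 360.0   ! total run length in days
/
```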
Changing the Ocean Domain
The KPP model needs information on both the atmosphere and ocean grids in order to change its domain. The grid sizes themselves are hardcoded in the KPP model; to change them you will need to edit the variables in the file parameter.inc. Check out the source from Subversion, enter that directory & change the include file, then build KPP following the instructions above.
The variables NX and NY describe the size of the ocean grid. You will need to change the values in the OASIS3 section on line 10. The variables NX_GLOBE and NY_GLOBE describe the atmosphere model. They are already set up for an N48 run on line 59. Note that if you change the domains you may need new ancillary files.
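An illustrative fragment of the relevant parameter.inc section is shown below. The syntax and values are examples only; edit the actual declarations in the file after #ifdef OASIS3. The N48 atmosphere values (96x73) follow the text above; the ocean grid values are placeholders.

```
! Illustrative fragment of parameter.inc -- values are examples only
#ifdef OASIS3
      INTEGER, PARAMETER :: NX = 64, NY = 28              ! ocean grid size
      INTEGER, PARAMETER :: NX_GLOBE = 96, NY_GLOBE = 73  ! N48 atmosphere grid
#endif
```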
The rest of the variables are loaded at runtime from the namelist file; changing these does not require the KPP model to be rebuilt. Make a copy of /short/projects/access/ancil/kpp/oceanancil/3D_ocn.nml, change the variables defined there, then set the UM environment variable KPPCONFIG (Input/Output Control->Time Convention and Environment Variables in the UMUI) to point to the new version.
- alon and alat are a longitude, latitude pair describing the co-ordinates of the south-west corner of the ocean grid. delta_lon and delta_lat describe how large a grid cell is in degrees. The KPP model can use irregular grid spacing, however this isn't used when it is coupled to the UM.
- ifirst, ilast, jfirst and jlast describe the area on the atmosphere grid to couple to. These values are grid-point co-ordinates, not map co-ordinates. In most cases the coupled region should match the size of the ocean grid, with the grid point at (ifirst, jfirst) located at co-ordinate (alon, alat).
Above is a diagram of how the variables relate to the grids. The pink area is the coupled region, the size of which is defined by the ifirst/ilast and jfirst/jlast variables. If the coupled region doesn't match the ocean grid size then it is placed in the south-west of the ocean grid. Make sure that the latitude values of both grids match.
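Putting the domain variables together, a hypothetical 3D_ocn.nml entry might look like the fragment below. All values are examples only (a 64x28 ocean grid of 3.75 x 2.5 degree cells, matching an N48 atmosphere); use values appropriate to your own domain and ancillary files.

```
! Illustrative domain settings in 3D_ocn.nml -- all values are examples
alon      = 0.0      ! longitude of the south-west corner
alat      = -60.0    ! latitude of the south-west corner
delta_lon = 3.75     ! grid cell width in degrees
delta_lat = 2.5      ! grid cell height in degrees
ifirst    = 1        ! coupled region on the atmosphere grid,
ilast     = 64       ! in grid-point co-ordinates
jfirst    = 10
jlast     = 37
```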
CRUNs work as normal; the KPP model will be started from a restart file. The file KPPocean.hist dictates the start and end time of a KPP run; for a CRUN, KPP will load KPP.restart.$start_time. No modifications need to be made to the KPP config file, as setting the correct start date is handled by the run scripts.
Coupled STASH fields
Fields from the ocean model are assigned the following STASH codes:
How it works
The job uses a modified run script ($KPPSCRIPTS/qsexecute) to run the UM. This run script copies the KPP ancillaries to the run directory & creates symbolic links oasis3, toyoce and um7.3x to the component executables (Oasis restricts how the executables are named). It then runs the job as a multiple-program MPI process, with Oasis passing fields back and forth between the models.
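The link-and-launch step can be sketched as below. This is illustrative only: the stand-in executables and processor counts are placeholders, and the mpirun command is echoed rather than executed (the real script uses the actual component binaries on NCI).

```shell
#!/bin/sh
# Sketch of the setup performed by $KPPSCRIPTS/qsexecute -- placeholders only
set -e
RUNDIR=$(mktemp -d)
cd "$RUNDIR"

# Stand-ins for the real component executables
touch oasis3.exe KPP_ocean umatmos.exe

# Oasis3 restricts how the components are named, so the run script
# links them to the fixed names it expects:
ln -sf oasis3.exe  oasis3
ln -sf KPP_ocean   toyoce
ln -sf umatmos.exe um7.3x

# The job is then launched as a multiple-program (MPMD) MPI run,
# along the lines of (not executed here; 64 is a placeholder count):
echo "mpirun -np 1 ./oasis3 : -np 1 ./toyoce : -np 64 ./um7.3x"
```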
These files should be in the KPPANCIL directory. Their names are set by the KPP namelist file, and they are copied to the run directory by the setup_hadgem3 script in KPPSCRIPTS:
- Initial conditions: initcond.nc
- Land-sea mask: lsm_ocndepth.nc
- Initial fluxes: kpp_initfluxes.nc
- Prescribed SST and ice fields for outside the simulation bounds: sst_clim.nc & ice_clim.nc
- Atmosphere at 30 min before simulation start: a2o.nc
These files are required to set up Oasis & its interpolations:
- Grid point coordinates: grids.nc
- Land-sea mask: masks.nc
- Grid cell area: areas.nc
Fields are exchanged every 3 hours, with a half hour lag in the Atmosphere->Ocean direction (At 03:00 the ocean gets the atmosphere fields as they were at 02:30).
Fields exchanged are:

Ocean -> Atmosphere
- OCN_SST: Sea surface temperature in K
- OFRZN01: Sea ice area fraction
- OSNWTN01: Surface snow amount in kg/m^2
- OHICN01: Ice depth (non-cf)
- SUNOCEAN: Surface grid eastward sea water velocity in m/s
- SVNOCEAN: Surface grid northward sea water velocity in m/s
Atmosphere -> Ocean
- HEATFLUX: Surface downward heat flux in W/m^2
- SOLAR: Surface net downward shortwave heat flux in W/m^2
- RUNOFF: Water flux into ocean from rivers in kg/m^2/s (not used by KPP)
- WME: Surface energy flux into ocean due to wind mixing in W/m^2 (not used by KPP)
- TRAIN: Rainfall flux in kg/m^2/s
- TSNOW: Snowfall flux in kg/m^2/s (not used by KPP)
- EVAP2D: Water evaporation flux
- LHFLX: Surface downward latent heat flux in W/m^2 (not used by KPP)
- TMLT01: Ice top melt (non-cf)(not used by KPP)
- BMLT01: Ice bottom melt (non-cf)(not used by KPP)
- TAUX: Surface downward grid eastward stress in Pa
- TAUY: Surface downward grid northward stress in Pa
Performance

Titles refer to the UM decomposition plus the number of extra processors used for the coupler and ocean model. In theory only 2 additional processors would be needed, but because of the queue configuration at NCI entire nodes must be requested when using more than 8 processors.
64x64 + 8 / 30 Days
- Wall Time: 00:29:57
- CPU Time: 32:28:17
- SU: 35.94
- Memory usage: 32 GB
- Output directory size: 16GB
Reproducibility

- Reproducible running an identical job twice? Yes
- Reproducible across different decompositions? Yes
Parallel KPP

The CoE is enhancing the KPP model so that it can be run in parallel, allowing for faster runtimes with global oceans.
The parallel version of KPP is in a separate branch, at fcm:um_dev/saw562/christine-kpp-raijin-parallel. It can be built by following the same instructions as for the serial version.
The parallel version works by adding a synchronization step at each model timestep, as well as after Oasis 'gets'. Since each grid point in the model is independent of all the others this can be done fairly simply using MPI gather operations (some complexity is added because the KPP grid layout is not ideal for this). The synchronization is handled in 'gather_fields.f90'.
Only the root KPP rank communicates with the coupler; all other KPP ranks ignore Oasis.
Resources

- KPP Documentation (relates to a slightly older version of the model)
Moving From Vayu
- Change the Target Machine to "raijin"
- Add the hand edits:
- Disable the hand edit "~access/vayu.sh"
- Add the FCM User File override "~access/umui_jobs/overrides/coupled/atm-pointers-noopt.ovr" (in Compilation -> FCM User Override Files)
- Change the FCM options for Atmosphere to:
- URL = fcm:um_dev/saw562/christine-kpp-raijin
- BIND = fcm:um_dev/Share/VN7.3_local_changes/src/configs/bindings
- CONTAINER = $UM_SVN_BIND/container.cfg@HEAD
- Compile KPP on Raijin (instructions above)
- Set the KPPBIN variable to point to your new KPP_ocean