Regional Tasmanian domain forced by GLORYS and ERA5 reanalysis datasets#
Note: the FRE-NC tools must be set up beforehand, as outlined in the documentation of the regional-mom6 package.
For this example we need the GLORYS ocean reanalysis, the ERA5 atmospheric reanalysis, and GEBCO bathymetry (see the table below). This notebook reads in the entire global extent of ERA5 and GEBCO, so there is no need to cut them down to size beforehand.
What does the regional_mom6 package do?#
Setting up a regional model in MOM6 can be a pain. The goal of this package is that users should spend their debugging time fixing a model that’s running and doing weird things, rather than puzzling over a model that won’t even start.
By the end of this notebook, you'll hopefully have a running MOM6 regional model. There will still be some fiddling to do with the MOM_input file to make sure the parameters suit your domain, and you might want to manually edit some of the input files. But this package should help you bypass most of the woes of regridding, encoding, and understanding the arcane arts of the MOM6 boundary segment files.
What does this notebook do?#
This notebook is designed to set you up with a working MOM6 regional configuration. First, try to get it running with our default Tasmania case, then you can clone the notebook and modify for your region of interest.
| Input Type | Source | Subsets required |
|---|---|---|
| Surface | ERA5 | Data from 2003; whole globe or subset around our domain |
| Ocean | GLORYS | Boundary segments & initial condition; see Step 2 for details |
| Bathymetry | GEBCO | Whole globe or subset around domain |
[ ]:
import regional_mom6 as rmom6
import os
from pathlib import Path
from dask.distributed import Client
Start a dask client.
[ ]:
client = Client()
client
Step 1: Choose our domain, define workspace paths#
To make sure that things are working, I'd recommend starting with the default example defined below. If this runs ok, then change to a domain of your choice and hopefully it runs ok too! If not, check the README and documentation for troubleshooting tips.

You can log in to the Copernicus Marine data store and use its GUI to find the lat/lon extent of your domain, then copy-paste it below.
[ ]:
expt_name = "tasmania-example-reanalysis"
latitude_extent = [-48, -38.95]
longitude_extent = [143, 150]
date_range = ["2003-01-01 00:00:00", "2003-01-05 00:00:00"]
## Place where all your input files go
input_dir = Path(f"mom6_input_directories/{expt_name}/")
## Directory where you'll run the experiment from
run_dir = Path(f"mom6_run_directories/{expt_name}/")
## Directory where compiled FRE tools are located (needed for construction of mask tables)
toolpath_dir = Path("PATH_TO_FRE_TOOLS")
## Path to where your raw ocean forcing files are stored
glorys_path = Path("PATH_TO_GLORYS_DATA")
## if directories don't exist, create them
for path in (run_dir, glorys_path, input_dir):
os.makedirs(str(path), exist_ok=True)
Step 2: Prepare ocean forcing data#
We need to cut out our ocean forcing. The package expects an initial condition and one time-dependent segment per non-land boundary. The naming convention is `east_unprocessed` for segments and `ic_unprocessed` for the initial condition.
Data can be downloaded directly from the Copernicus Marine data store via their GUI (once logged in).
1. Initial condition: Using the GUI, select an area matching your `longitude_extent` and `latitude_extent` that corresponds to the first day in your date range. Download the initial condition and save it with filename `ic_unprocessed.nc` inside the `glorys_path` directory.
2. Boundary forcing: Using the GUI, select the eastern boundary of your domain (if you have one that contains ocean). Allow for a buffer of ~0.5 degrees in all directions, and download for the prescribed `date_range`. Save the file as `east_unprocessed.nc`.
3. Repeat step 2 for the remaining ocean boundaries of the domain.
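Before moving on, a quick sanity check that all the downloads landed where the package expects them can save a confusing error later. A minimal sketch (replace the placeholder path with your own `glorys_path`, and trim the boundary list if some sides are land):

```python
from pathlib import Path

glorys_path = Path("PATH_TO_GLORYS_DATA")  # the same directory defined in Step 1

# Filenames the package expects, assuming all four boundaries contain ocean
expected = ["ic_unprocessed.nc"] + [
    f"{side}_unprocessed.nc" for side in ("north", "south", "east", "west")
]
missing = [name for name in expected if not (glorys_path / name).exists()]
print("Missing forcing files:", missing or "none")
```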
Step 3: Make experiment object#
The regional_mom6.experiment object contains the regional domain basics, generates the horizontal and vertical grids (hgrid and vgrid respectively), and sets up the directory structures.
[ ]:
expt = rmom6.experiment(
longitude_extent = longitude_extent,
latitude_extent = latitude_extent,
date_range = date_range,
resolution = 0.05,
number_vertical_layers = 75,
layer_thickness_ratio = 10,
depth = 4500,
mom_run_dir = run_dir,
mom_input_dir = input_dir,
toolpath_dir = toolpath_dir
)
We can now access the horizontal and vertical grids of the regional configuration via expt.hgrid and expt.vgrid respectively.

Plotting the vertical grid with marker = '.' lets you see the spacing. To plot the vertical spacings themselves, take differences along the layer dimension with xarray's diff, e.g.,

expt.vgrid.zl.diff(dim="zl").plot(marker = '.')

shows you the vertical spacing profile.
Modular workflow!#
After constructing your expt object, if you don't like the default hgrid and vgrid you can simply modify them and then save them back into the expt object. However, you'll then also need to save them to disk again. For example:

import xarray as xr
new_hgrid = xr.open_dataset(input_dir / "hgrid.nc")

Modify new_hgrid, ensuring that all metadata is retained to keep MOM6 happy. Then save your changes:

expt.hgrid = new_hgrid
expt.hgrid.to_netcdf(input_dir / "hgrid.nc")
Step 4: Set up bathymetry#
Similarly to the ocean forcing, we point the experiment's setup_bathymetry method at the bathymetry file of choice and provide the coordinate names. We don't need to pre-subset the bathymetry since it is simply a two-dimensional field and easy to handle. Afterwards, you can inspect expt.bathymetry to have a look at the regional domain.

After running this cell, your input directory will also contain other bathymetry-related files, such as the ocean mosaic and the mask table. The mask table defaults to a 10x10 layout and can be modified later.
[ ]:
expt.setup_bathymetry(
bathymetry_path='PATH_TO_GEBCO_FILE/GEBCO_2022.nc',
longitude_coordinate_name='lon',
latitude_coordinate_name='lat',
vertical_coordinate_name='elevation',
minimum_layers=1
)
Check out your domain:#
[8]:
expt.bathymetry.depth.plot()
[8]:
<matplotlib.collections.QuadMesh at 0x14ae191683a0>
Step 5: Handle the ocean forcing - where the magic happens#
This cuts out and interpolates the initial condition as well as all boundary segments (unless you don't pass it any boundaries).

The dictionary maps the MOM6 variable names to their names in your ocean input file. Notice how for GLORYS the horizontal dimensions are latitude and longitude, whereas MOM6 uses xh, yh, xq, yq. This is because on an 'A' grid, tracers and velocities share the same grid points, so a single pair of horizontal dimensions suffices.

If one of your segments is land, you can delete its string from the boundaries list. You'll then need to update MOM_input accordingly, so MOM6 knows how many segments to look for and their orientations.
[ ]:
# Define a mapping from the GLORYS variables and dimensions to the MOM6 ones
ocean_varnames = {"time": "time",
"yh": "latitude",
"xh": "longitude",
"zl": "depth",
"eta": "zos",
"u": "uo",
"v": "vo",
"tracers": {"salt": "so", "temp": "thetao"}
}
# Set up the initial condition
expt.initial_condition(
glorys_path / "ic_unprocessed.nc", # directory where the unprocessed initial condition is stored, as defined earlier
ocean_varnames,
arakawa_grid="A"
)
# Now iterate through our four boundaries
for i, orientation in enumerate(["south", "north", "west", "east"]):
expt.rectangular_boundary(
glorys_path / (orientation + "_unprocessed.nc"),
ocean_varnames,
orientation, # Needs to know the cardinal direction of the boundary
i + 1, # Just a number to identify the boundary. Indexes from 1
arakawa_grid="A"
)
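As a sketch of the land-boundary case mentioned above: if, say, the western boundary were entirely land, the loop would shrink to three segments (segment numbers still index from 1, and MOM_input must be updated to match). The list below is purely illustrative:

```python
# Hypothetical variant: the western boundary is land, so only three
# ocean segments are processed. Each still gets a 1-based segment number.
boundaries = ["south", "north", "east"]
for i, orientation in enumerate(boundaries):
    segment_number = i + 1  # passed to expt.rectangular_boundary as before
    print(segment_number, orientation)
```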
Step 6: Run the FRE tools#
This is just a wrapper for the FRE tools needed to make the mosaics and masks for the experiment. The only thing you need to tell it is the processor layout. In this case we’re saying that we want a 10 by 10 grid of 100 processors.
[ ]:
expt.FRE_tools(layout=(10, 10)) ## Here the tuple defines the processor layout
Step 7: Set up ERA5 forcing:#
Here we assume the ERA5 dataset is stored somewhere on the system we are working on.
Below is a table showing ERA5 characteristics and what needs to be done to sort it out.
Required ERA5 data:
| Name | ERA5 filename | ERA5 variable name | Units |
|---|---|---|---|
| Surface Pressure | sp | sp | Pa |
| Surface Temperature | 2t | t2m | K |
| Meridional Wind | 10v | v10 | m/s |
| Zonal Wind | 10u | u10 | m/s |
| Specific Humidity | – | – | kg/kg, calculated from dewpoint temperature |
| Dewpoint Temperature | 2d | d2m | K |
We calculate specific humidity \(q\) from dewpoint temperature \(T_d\) and surface pressure \(P\) via saturation vapour pressure \(P_v\).
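As an illustration of that calculation (this is a common approximation, not necessarily the exact form the package uses): the Magnus formula gives the saturation vapour pressure at the dewpoint, from which specific humidity follows.

```latex
% Saturation vapour pressure at the dewpoint (Magnus formula; T_d in kelvin, pressures in Pa).
% NOTE: a standard approximation, assumed here for illustration only.
P_v = 610.94 \exp\!\left( \frac{17.625\,(T_d - 273.15)}{(T_d - 273.15) + 243.04} \right)

% Specific humidity from P_v and surface pressure P:
q = \frac{0.622\, P_v}{P - 0.378\, P_v}
```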
[ ]:
expt.setup_era5("PATH_TO_ERA5_DATA/era5/single-levels/reanalysis")
Step 8: Modify the default input directory to make a (hopefully) runnable configuration out of the box#
This step copies the default directory and modifies the MOM_layout files to match your experiment by inserting the right number of x, y points and the CPU layout.

To run MOM6 using the payu infrastructure, provide the keyword argument using_payu = True to the setup_run_directory method, and an example config.yaml file will appear in the run directory. The config.yaml file needs to be modified manually to add the locations of executables, etc.
[ ]:
expt.setup_run_directory(surface_forcing = "era5")
Step 9: Run and Troubleshoot!#
To run the regional configuration, first navigate to your run directory in a terminal, then use your favourite tool to run the experiment on your system.
Ideally, MOM6 runs. If not, the first thing you should try is reducing the timestep. You can do this by adding #override DT=XXXX
to your MOM_override
file.
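For example, to force a shorter baroclinic timestep, add a line like the following to MOM_override (the 600 s value is only a placeholder; choose something appropriate for your resolution):

```
#override DT = 600.0
```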
If there's strange behaviour at your boundaries, you could experiment with the nudging timescale (an example is already included in the MOM_override file). Sometimes, if a boundary has a lot going on (like all of the eddies spinning off the western boundary currents or off the Antarctic Circumpolar Current), these edge effects can be hard to avoid. This is because the chaotic, submesoscale structures developed within the regional domain won't match the flow at the boundary.

Another thing that can go wrong is little bays that create non-advective cells at your boundaries. Keep an eye out for tiny bays where one side is taken up by a boundary segment. You can either fill them in manually, or move your boundary slightly to avoid them.
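Filling a cell by hand amounts to setting its depth to zero in the bathymetry field. A self-contained toy sketch of the idea (the variable names, dimensions, and indices here are illustrative, not the package's actual output layout):

```python
import numpy as np
import xarray as xr

# Toy stand-in for a bathymetry field like expt.bathymetry.depth
depth = xr.DataArray(np.full((5, 5), 1000.0), dims=("ny", "nx"), name="depth")

# Suppose cell (ny=4, nx=0) is a one-cell bay sitting on a boundary segment:
# setting its depth to zero turns it into land, removing the non-advective cell.
depth[dict(ny=4, nx=0)] = 0.0

print(depth.values[4, 0], depth.values[0, 0])
```

After editing, you would save the modified field back to the bathymetry file in your input directory so MOM6 picks up the change.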