Trigrid coupling with SE dycore, CAM standalone | Modifying ocn_in, drv_in, etc.

I was wondering if there was additional documentation on the trigrid coupling ability (introduced in CPL7) in CAM standalone framework.

Primarily, I'm interested in using the SE (cubed-sphere) dycore for the atmosphere and other standard grids for ocean (data), land, and ice. I'm currently operating with CAM 5.1.27.

Building a test case with a dycore such as FV and setting bndtvs, focndomain, fatmlndfrc, and fsurdat works fine as long as I specify the same grid for the ocean/ice as for the atmosphere. However, building with SE (HOMME) and using the same ocean/land ICs as in the FV case triggers an error in the CICE routine (ice_grid.f90), which checks that the lat/lon dimensions match. With a standard lat/lon ocean/ice grid and SE, ice_grid.f90 reads the number of columns (ncol) from HOMME, which mismatches any lat/lon grid selected.

I have gone in and manually edited drv_in* to try setting a few flags like "samegrid_ao" to false, but this seems to have had no effect. I was originally concerned that I needed to operate solely in the CESM framework, but it appears that the source contained in CAM includes the machinery for this multi-grid coupling (creating the weights files before the run, etc.). My experience with CAM has mostly been aquaplanet simulations or out-of-box production runs, so mixing grids among the ocean/ice and land components is a new frontier for me.
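For reference, the kind of hand edit I tried looks roughly like this (a sketch only; I believe these flags sit in the coupler's seq_infodata_inparm group, but the group name and the full set of samegrid flags may differ in your tag):

```fortran
&seq_infodata_inparm
  ! hypothetical fragment: tell the coupler not to assume the
  ! atm and ocn (or atm and lnd) share a grid -- this had no
  ! effect in my runs
  samegrid_ao = .false.
  samegrid_al = .false.
/
```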

---

* As an addendum, is there any way to specify custom namelist options for the ocean or land components from CAM's (standalone) build-namelist utility? I know there is a -cice_nl flag to send a string to CICE, but I see nothing comparable for a standard docn or CLM that can be done without modifying the _in files by hand after configure.

Sorry for the novice questions.
 

hannay

Cecile Hannay
AMWG Liaison
Staff member
Beware that the spectral element dycore hasn't been released yet. It is part of the trunk, but we are still working on the configuration.

About your namelist question:

If you are using the CESM scripts:

- You can specify CLM namelist changes in user_nl_clm:
http://www.cesm.ucar.edu/models/ccsm4.0/clm/models/lnd/clm/doc/UsersGuide/x1327.html

- You can modify the ocean namelists in Buildconf/pop2.buildnml.csh if you are running coupled,
- or in Buildconf/docn.buildnml.csh if you are running CAM standalone with prescribed SSTs.
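For example, a user_nl_clm fragment might look like the following (the variables shown are illustrative; check the CLM namelist documentation linked above for the ones you actually need):

```fortran
&clm_inparm
 ! illustrative settings: write daily history files,
 ! one time sample per file
 hist_nhtfrq = -24
 hist_mfilt  = 1
/
```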
 

eaton

CSEG and Liaisons
Currently the use of trigrid coupling is only supported from the
CESM scripts. CAM's configure has not been updated to support
building the non-atm components on a different grid from CAM, nor
has the build-namelist utility been updated to specify the
required mapping files. I anticipate having these capabilities
implemented by the time CAM is released with the SE dycore as an
officially supported option.

You are correct that the source tree of a CAM development tag
such as cam5_1_27 contains all the necessary machinery to run in
trigrid mode. It's just a matter of getting the configure and
build-namelist utilities to perform the same setup that's
currently done by the CESM scripts for an F compset running in a
trigrid mode.

When running CAM's build-namelist in the standalone mode it is
currently only possible to directly specify namelist variables for
the CICE component (via the -cice_nl option). That's because
CAM's build-namelist calls the CICE build-namelist and just
passes this argument along. Namelists for the other non-atm
components are handled directly by CAM's build-namelist rather
than delegating responsibility to the other components' namelist
building utilities. You will find the namelist variables used to
control the behavior of other components listed in CAM's namelist
documentation. This architecture is just an artifact of the
historical development process. Current CESM development has
extended the use of build-namelist utilities to all the CESM
components. We'll be updating CAM's build-namelist to make
direct use of the non-atm build-namelist utilities.
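To illustrate the -cice_nl pass-through, a standalone invocation looks something like this (paths and the CICE variable shown are placeholders, not a recipe; the quoted string is handed along unchanged to CICE's build-namelist):

```shell
# Sketch only: $camcfg stands for wherever CAM's bld directory lives.
$camcfg/build-namelist -config config_cache.xml \
    -cice_nl "&domain_nml distribution_type='roundrobin' /"
```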
 
Thank you Brian/Cecile.

That pretty much confirms my suspicions. I'll start migrating over to CESM soon.

If I could ask one more question: who at CESM would be a good contact regarding the creation of mapping files for the trigrid? To be clear, I'm looking to take new atmospheric grids (which I have already created and run CAM standalone with) and couple them to existing ocean/ice/land grids.

I have found some material in the repo (/fs/cgd/csm/mapping on Bluefire, for example), but the README is 5 years old, and some documentation implies there are newer ESMF utilities available that might make the process a bit more streamlined, especially for unstructured grids like the cubed sphere. For example, I need a way to convert HOMME/SE grids to SCRIP format before generating weights. Since there is more support for grids like ne30, ne240, etc., I'm hoping something now exists that would make my life slightly easier!
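Concretely, what I'm hoping exists is something along the lines of ESMF's offline weight generator, e.g. (file names invented, and I'd still need the SE-to-SCRIP conversion step beforehand):

```shell
# Sketch: generate conservative mapping weights between an SE atm grid
# (already converted to SCRIP format) and a lat/lon ocean grid.
ESMF_RegridWeightGen -s ne30np4_scrip.nc -d gx1v6_ocn_scrip.nc \
    -w map_ne30np4_to_gx1v6_aave.nc -m conserve
```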

Thanks again for the help. I really appreciate it.