Changing the time step size for SCAM Experiment

r_a_khan (Ramsha Khan)
Hi,

I have been running SCAM experiments using one of the provided IOPs with the default time step of 1200 s (EUL dycore, 64 x 128 grid). I would now like to change the time step size, but I am unsure which variable is the correct one to change. I am running the experiments via CESM2.1.0, and according to the namelist definitions,

dtime:
The length (in seconds) of the atm time step, i.e., the driver calls the
atm component once every dtime seconds. This is also the coupling interval
between the dynamics and physics packages. This variable is not actually
used in the atm model, but rather is used by build-namelist to set the
value of atm_cpl_dt. So it will have an effect only
when running CAM using standalone scripts. The CESM scripts have their own
method for setting atm_cpl_dt.

So, since I am running CESM2.1.0 rather than a standalone CAM model, is dtime the wrong variable to adjust? Looking up the variable 'atm_cpl_dt', I find the following in the CESM caseroot definitions:

ATM_NCPL:
Number of atm coupling intervals per NCPL_BASE_PERIOD.
This is used to set the driver namelist atm_cpl_dt, equal to basedt/ATM_NCPL,
where basedt is equal to NCPL_BASE_PERIOD in seconds.
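
If I am reading that definition correctly, and assuming NCPL_BASE_PERIOD is the default of one day (86400 s) in my case, which I have not confirmed, the arithmetic would be roughly the following sketch (the function name is just mine for illustration):

    # Sketch of the relationship quoted above, assuming NCPL_BASE_PERIOD = one day.
    SECONDS_PER_DAY = 86400  # basedt when NCPL_BASE_PERIOD is "day"

    def cpl_dt(atm_ncpl, basedt=SECONDS_PER_DAY):
        """atm_cpl_dt = basedt / ATM_NCPL, per the caseroot definition above."""
        return basedt / atm_ncpl

    print(cpl_dt(72))  # 1200.0 s, consistent with my current default time step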

So my question is, should I change dtime via the namelist, or should I adjust ATM_NCPL via xmlchange?
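
If the ATM_NCPL route is the right one, I assume I would just invert that relationship for whatever target step I pick; the 600 s below is only a hypothetical example:

    # Sketch: ATM_NCPL needed for a desired coupling interval (hypothetical 600 s target).
    SECONDS_PER_DAY = 86400

    def ncpl_for(target_dt, basedt=SECONDS_PER_DAY):
        """ATM_NCPL giving the requested atm_cpl_dt; the step must divide basedt evenly."""
        assert basedt % target_dt == 0, "time step must divide the base period"
        return basedt // target_dt

    print(ncpl_for(600))  # 144 couplings per day for a 600 s step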

Thanks,
Ramsha