
[porting] CAM error growth test


I was wondering if somebody could advise us on the following. We are currently porting CESM 1.1.2 to a machine in Holland (Cartesius) to run some long-term coupled experiments. To verify the port we compared the model output with a run from Yellowstone and observed significant differences in the atmosphere within the first month. To make sure we are looking at normal weather variability and not a broken installation of CESM, I decided to do some more testing on the different components.

In the CESM user guide it is recommended to "Perform validation (both functional and scientific):

a. Perform a CAM error growth test.
b. Follow the CCSM4.0 CICE port-validation procedure.
c. Follow the CCSM4.0 POP2 port-validation procedure."

For the first point (a) the user is referred to the page

Following the steps on this page, and after some sweating, I was able to produce the requested RMS values for the temperature. However, they do not quite follow the same pattern as the perturbation experiment on the website (see attached picture). The RMS difference observed in my tests ranges from 3.6e-3 for the first sample to 6.1e-2 for the last sample (2 days).
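For anyone reproducing this: the RMS value in question is just the root-mean-square of the point-wise temperature differences between the perturbed and control runs. A minimal sketch of that computation, using NumPy with synthetic arrays standing in for the actual CAM history files (in practice one would read the "T" variable from the two history files with netCDF4; the shapes and perturbation size below are illustrative only):

```python
import numpy as np

def rms_diff(t_control, t_perturbed):
    """Root-mean-square of the point-wise temperature difference (K)."""
    return np.sqrt(np.mean((t_perturbed - t_control) ** 2))

# Synthetic stand-ins for two temperature snapshots (lat x lon x lev);
# real data would come from the CAM history files of the two runs.
rng = np.random.default_rng(0)
t_control = 250.0 + 30.0 * rng.random((96, 144, 30))
# Roundoff-scale perturbation, as used in the port-validation test.
t_perturbed = t_control + 1e-11 * rng.standard_normal(t_control.shape)

print(rms_diff(t_control, t_perturbed))  # on the order of 1e-11
```

Repeating this for each history sample over the two days gives the RMS-versus-time curve that is compared against the reference plot.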

There are a couple of questions I would like to ask regarding the validation test and the figure.

1. Why is the configure command run with the "-phys cam4" parameter when the validation test itself clearly concerns CAM5?
2. Is the suggested 1e-11 RMS difference at all realistic for this test, or should we aim for lower values like ours?
3. Already from the first sample the RMS difference is large compared to the reference experiment; could this be related to file I/O or different initialisation? I have used both the CAM source code that came with CESM 1.1.2 and that of 1.2.1.

Any help would be greatly appreciated. Thanks.


Sorry but the CAM error growth page that you looked at was a bit out of
date.  I have done some updating and have supplied new control runs that
are compatible with the cesm1.0.6 and cesm1.2.2 code bases.  These are the
currently supported release versions.

The error growth test results that you posted are indeed examples of failed
tests.  Although the specific parameterizations that comprise the cam4
physics package have not changed, there have possibly been bug fixes,
parameter changes, or changes in initial or boundary datasets which render
your cesm1.1.2 and cesm1.2.1 runs incompatible with the cam5.0 (cesm1.0)
control that was linked to the port page.  I believe the control for
cesm1.2.2 will be compatible with the cesm1.2.1 run that you've done.  If
that test doesn't pass then please let us know.

Replies to your specific questions:

1) There is some unfortunate confusion by our use of CAM to describe both a
model framework and a physics package.  The CAM5 model is capable of
running the cam3, cam4, and cam5 physics packages (as well as adiabatic and
idealized (Held-Suarez) physics).

A related point here is that the error growth test does not work with the
cam5 physics package.  We still find the test using cam4 physics to be
useful as part of a general procedure which validates simpler model
configurations before doing an expensive full climate simulation.  But
currently the only option for validating the cam5 physics package is by
doing a full climate simulation.  We are working on a project which will
make this validation much less expensive via use of a control that is
comprised of a large ensemble of 1 year runs.  I expect this capability
will be released within the coming year.

2) A successful error growth test will have RMS temperature differences of
roughly 1.e-11 at the initial timestep.
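This pass criterion can be encoded as a simple check on the first sample of the RMS series. A hypothetical sketch (the tolerance of 1e-10, chosen a bit above the expected ~1e-11 roundoff level, and the example series are illustrative, not an official threshold):

```python
def error_growth_passes(rms_series, tol=1e-10):
    """Pass if the initial-timestep RMS temperature difference is at
    roundoff level; later samples are expected to grow from there."""
    return rms_series[0] < tol

# Illustrative RMS series (K): one at roundoff level, one resembling
# the 3.6e-3 .. 6.1e-2 values reported in the original post.
passing = [3.0e-12, 5.0e-9, 2.0e-7]
failing = [3.6e-3, 1.2e-2, 6.1e-2]
print(error_growth_passes(passing), error_growth_passes(failing))  # True False
```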

3) As mentioned above, there have been small changes between the model releases even though the set of parameterizations used for cam4 physics is the same. Without some digging I can't say for sure what caused the large diffs you saw.


Thanks, Eaton, for clarifying the questions and supplying the new reference runs. I have compared my output to your CESM 1.2.2 reference and all seems to be in order now (see attached figure). The version of CAM in CESM 1.2.1 is labeled "cesm1_2_1_n14_cam5_3_01", which is also 5.3.

On the other hand, my CESM 1.1.2 installation uses CAM 5.2, which does not compare well against either the 5.1 or 5.3 reference output. In light of your explanation this is hardly a surprise. However, since the porting process was nearly identical for both CESM installations, we have gained some confidence that CESM 1.1.2 is also functioning correctly.

1) I see you've updated the title of the page accordingly, thanks. A standalone validation test for CAM5 would definitely be of interest to us, as we're using CAM5 compsets.

2+3) In the past I have seen similar differences across platforms and compilers, most often due to different vectorization schemes. I reckoned this could explain the observed differences in this case, too. However, the results at the initial timestep should still be identical in that case, so I'm glad the new test was successful.



Thanks for the follow-up.  The validation plot you posted looks as expected for a successful test.


