Dear All,
I'm trying to run an aquaplanet (QPC6, a compset I have used extensively with f19 resolution) simulation on a custom grid using CESM 2.1.3, but the case fails on execution.
I have set up a custom 2x20 grid (actually ~1.9x20, with nlat = 96 and nlon = 18) following Section 3.2 of these instructions and the guidance in this thread, starting with the attached SCRIP grid file (created the same way as for the 10x15 grid, using a slightly modified version of the mkscripgrid.ncl script available at clm/tools/mkmapgrids). I have updated cam/bld/config_files/horiz_grid.xml and config_grids.xml (both attached). I am able to create the case (attached README.case) and build the executable successfully. I have specified the CAM namelist parameters using the defaults from the atm_in file of the same compset at f19 resolution (attached user_nl_cam). I have also tried to create the initial condition dataset for this grid (specified via the CAM namelist parameter ncdata, also attached) from the corresponding f19 file (atm/cam/inic/fv/aqua_0006-01-01_1.9x2.5_L32_c161020.nc). However, the case fails to execute, and the job exits almost immediately (attached log files).
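For reference, the nominal spacing implied by nlat = 96 and nlon = 18 can be checked with a short Python sketch (this assumes the CAM-FV convention of latitude points running pole to pole, so the latitude spacing is 180/(nlat-1)):

```python
# Nominal grid spacing for a finite-volume latitude-longitude grid
# with points at both poles (the CAM-FV convention).
nlat, nlon = 96, 18

dlat = 180.0 / (nlat - 1)  # latitude spacing in degrees
dlon = 360.0 / nlon        # longitude spacing in degrees

print(f"~{dlat:.2f} x {dlon:.0f} degree grid")  # → ~1.89 x 20 degree grid
```

This is why the 96x18 grid comes out as "~1.9x20" rather than exactly 2x20.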
A few points I should mention here:
1. After getting an "incompatible domain grids" error message as follows:
(seq_mct_drv) : Performing domain checking
(seq_domain_check) --- checking ocean maskfrac ---
(seq_domain_check) --- checking atm/ocn domains ---
(seq_domain_check_grid) the domain size is = 72
(seq_domain_check_grid) maximum difference for lat 0.284217094304040E-13
(seq_domain_check_grid) maximum allowable difference for lat 0.100000000000000E-11
(seq_domain_check_grid) the domain size is = 72
(seq_domain_check_grid) maximum difference for lon 0.568434188608080E-13
(seq_domain_check_grid) maximum allowable difference for lon 0.100000000000000E-11
(seq_domain_check_grid) the domain size is = 72
(seq_domain_check_grid) maximum difference for area 0.118502941352520E-03
(seq_domain_check_grid) maximum allowable difference for area 0.900000000000000E-06
(seq_domain_check_grid) ERROR: incompatible domain grid coordinates
ERROR: (seq_domain_check_grid) incompatible domain grid coordinates
I increased EPS_AAREA to 9E-02 to progress further.
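For the record, I made that tolerance change with xmlchange in the case directory (a sketch; EPS_AAREA is the area-comparison tolerance in env_run.xml):

```
./xmlchange EPS_AAREA=9.0e-2
```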
2. After this, I keep getting the following error:
Lagrangian levels are crossing 9.999999999999999E-012
Run will ABORT!
Suggest to increase NSPLTVRM
ERROR: te_map: Lagrangian levels are crossing
On increasing fv_nspltvrm (along with fv_nsplit and fv_nspltrac) beyond a certain point (all at 12), I started getting a segmentation fault with error code 174. I had started out with npr_yz = 30,6,6,30; by gradually decreasing it (now at 5,1,1,5) to lower the maximum number of tasks required, I was able to avoid the segmentation fault. Still, the Lagrangian-levels-crossing error kept occurring until I hit a segmentation fault again (now with nsplit, nspltrac, and nspltvrm all at 96).
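For concreteness, the dynamics subcycling and decomposition settings described above ended up in my user_nl_cam along these lines (the values shown are the last combination I tried, not a recommendation):

```
 fv_nsplit   = 96
 fv_nspltrac = 96
 fv_nspltvrm = 96
 npr_yz      = 5,1,1,5
```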
3. I have also tried adjusting the time-step settings in user_nl_cam and ATM_NCPL (separately), but they don't seem to make much difference.
From browsing the discussion forum, I have observed that the Lagrangian-levels-crossing error usually occurs in WACCM runs, and I suspect it is happening here because of the "arbitrary nature" of the grid I'm generating. This raises the question of whether it is physically possible to run CESM on this and other arbitrarily specified grids at all, and if it is, what rules of thumb (perhaps reflected in the default values of the error tolerances) one must follow in defining such grids.
I would like to request the community's guidance in resolving this issue. Any tips or pointers will be greatly appreciated.
Thanks a lot,
Abu Bakar Siddiqui