
CCSM3 on cheyenne T42 gx1v3

ahu

New Member
Dear all,

Recently I have had to run a few new simulations using CCSM3 on Cheyenne. With help from Feng He, I can successfully compile the code, but I cannot run it. I have asked CISL for help; they indicate that there must be a mismatch with the environment setup. I would like to know whether someone has run CCSM3 on Cheyenne recently. If yes, would you be willing to help? Thanks!

Aixue

Here are my case root and run directories:

case root: /glade/p/cgd/ccr/people/ahu/ccsm3/ccsm3_on_cheyenne/b30.232mgcha4
run directory: /glade/scratch/ahu/b30.232mgcha4

Here is the run time error:

Currently Loaded Modules:
1) pgi/20.4 2) ncarenv/1.3 3) netcdf/4.7.4 4) mpt/2.22



/glade/u/apps/ch/opt/mpt_fmods/2.22/pgi/20.4:/glade/u/apps/ch/opt/mpt/2.22/lib:/glade/u/apps/opt/pgi/20.4/linux86-64/20.4/lib:/glade/u/apps/ch/os/usr/lib64:/glade/u/apps/ch/os/usr/lib:/glade/u/apps/ch/os/lib64:/glade/u/apps/ch/os/lib:/glade/u/apps/ch/opt/netcdf/4.7.4/pgi/20.4/lib
Fri Aug 5 09:49:47 MDT 2022 -- CSM EXECUTION BEGINS HERE
[1] 4471
MPT: libxmpi.so 'HPE MPT 2.22 03/31/20 16:00:17'
asallocash failed: array services not available
mpiexec_mpt: all_launch.c:737: newash: Assertion `new_ash != old_ash' failed.
[1] Abort mpiexec_mpt -v -p [%g] -np 16 ./cpl : -np 16 ./clm : -np 48 ./pop : -np 16 ./csim : -np 32 ./cam > output (core dumped)
Fri Aug 5 09:49:47 MDT 2022 -- CSM EXECUTION HAS FINISHED
Model did not complete - see cpl.log.220805-094931
ccsm3_on_cheyenne/b30.232mgcha4> pwd
/glade/p/cgd/ccr/people/ahu/ccsm3/ccsm3_on_cheyenne/b30.232mgcha4
 

jedwards

CSEG and Liaisons
Staff member
Another issue I see is in the .run file:
#PBS -l select=4:ncpus=32:mpiprocs=32:ompthreads=32
Should be
#PBS -l select=4:ncpus=36:mpiprocs=36:ompthreads=1
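For context, Cheyenne compute nodes have 36 cores each, so requesting ncpus=32 leaves a mismatch with the scheduler's node layout, and ompthreads should be 1 for a pure-MPI run. A minimal sketch of how the corrected directive fits into a PBS job header (the job name, queue, and walltime shown here are placeholders, not values from the original script):

```shell
#!/bin/csh
#PBS -N b30.232mgcha4                         # job name (example)
#PBS -q regular                               # queue is an assumption
#PBS -l walltime=06:00:00                     # placeholder walltime
# 4 nodes x 36 cores, one MPI rank per core, no OpenMP threading:
#PBS -l select=4:ncpus=36:mpiprocs=36:ompthreads=1
```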
 

ahu

New Member
Hi Jim,

How do I set MPI_USE_ARRAY=True? Which file should I modify -- the run script, the build script, etc.?

Thanks!

Aixue
 

ahu

New Member
I tried, but it gives me an error message. Maybe I did not do it right. Here is what I did:

setenv MPI_USE_ARRAY=True

Is this right?

Thanks!
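For what it's worth, csh/tcsh `setenv` takes the variable name and value as two separate arguments with no `=` sign, which would explain the error from the line above; the `NAME=value` form belongs to Bourne-style shells. A minimal sketch of the two syntaxes (the variable name is taken from the thread; whether MPT actually reads it is not shown here):

```shell
# csh/tcsh form: name and value separated by a space, no "=":
#   setenv MPI_USE_ARRAY true
# sh/bash equivalent uses "=" with export:
export MPI_USE_ARRAY=true
echo "$MPI_USE_ARRAY"
```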
 