We are getting an MPI_ABORT when we run case.submit. I am trying to set up a working test case to make sure CESM is functioning as expected. This is the create_newcase command I am using:
scripts/create_newcase --case /scratch/$USER/cesmtest --res f09_g17 --compset I1850Clm50Sp --machine monsoon -i /scratch/$USER/cesminput
I have the monsoon machine set up in the XML files.
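For reference, after creating the case I am just running the stock CIME case scripts from the case directory, nothing custom:

cd /scratch/$USER/cesmtest
./case.setup      # configure the case
./case.build      # build the model
./case.submit     # submit the run (this is where the failure occurs)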
case.setup and case.build appear to complete without problems, but case.submit ends with the following error:
(seq_comm_printcomms) 38 0 32 1 CPLIAC:
(t_initf) Read in prof_inparm namelist from: drv_in
(t_initf) Using profile_disable= F
(t_initf) profile_timer= 4
(t_initf) profile_depth_limit= 4
(t_initf) profile_detail_limit= 2
(t_initf) profile_barrier= F
(t_initf) profile_outpe_num= 1
(t_initf) profile_outpe_stride= 0
(t_initf) profile_single_file= F
(t_initf) profile_global_stats= T
(t_initf) profile_ovhd_measurement= F
(t_initf) profile_add_detail= F
(t_initf) profile_papi_enable= F
ERROR: (shr_stream_getCalendar) ERROR: nf90_open file /scratch/wew/cesminput/lmwg/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/Solar/clmforc.GSWP3.c2011.0.5x0.5.Solr.1901-01.nc
#0 0x145c651a7131 in ???
#1 0xc3218e in ???
#2 0xc3235b in ???
#3 0xd24143 in ???
#4 0xd2c9c4 in ???
#5 0xd0b9f4 in ???
#6 0x4b99a4 in ???
#7 0x4ae674 in ???
#8 0x423daa in ???
#9 0x413b08 in ???
#10 0x42150a in ???
#11 0x145c643ee6a2 in ???
#12 0x4065fd in ???
#13 0xffffffffffffffff in ???
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1001.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
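The nf90_open failure points at that GSWP3 solar forcing file, so as a first sanity check I can verify that the file exists, is readable, and is a valid NetCDF file (a minimal check, assuming the standard NetCDF utilities are available on monsoon):

ls -l /scratch/wew/cesminput/lmwg/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/Solar/clmforc.GSWP3.c2011.0.5x0.5.Solr.1901-01.nc
ncdump -h /scratch/wew/cesminput/lmwg/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/Solar/clmforc.GSWP3.c2011.0.5x0.5.Solr.1901-01.nc | head

Depending on the CIME version, running ./check_input_data from the case directory (with its download option, if supported) may also report or fetch any missing input files.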
What might be causing this error?