Hi, can anyone check what this error from cesm.log means? There are lots of warnings during the build. Thanks.
User-specified PIO rearranger comm max pend req (comp2io), 0 (value will be reset as requested)
Resetting PIO rearranger comm max pend req (comp2io) to 64
PIO rearranger options:
comm type = p2p
comm fcd = 2denable
max pend req (comp2io) = 64
enable_hs (comp2io) = T
enable_isend (comp2io) = F
max pend req (io2comp) = 64
enable_hs (io2comp) = F
enable_isend (io2comp) = T
(seq_comm_setcomm) init ID ( 1 GLOBAL ) pelist = 0 191 1 ( npes = 192) ( nthreads = 1)( suffix =)
(seq_comm_setcomm) init ID ( 2 CPL ) pelist = 0 127 1 ( npes = 128) ( nthreads = 1)( suffix =)
(seq_comm_setcomm) init ID ( 5 ATM ) pelist = 0 127 1 ( npes = 128) ( nthreads = 1)( suffix =)
.
.
(seq_comm_joincomm) init ID ( 38 CPLIAC ) join IDs = 2 37 ( npes = 128) ( nthreads = 1)
(seq_comm_jcommarr) init ID ( 35 ALLIACID ) join multiple comp IDs ( npes = 1) ( nthreads = 1)
(seq_comm_joincomm) init ID ( 36 CPLALLIACID ) join IDs = 2 35 ( npes = 128) ( nthreads = 1)
[w-cal-liu:14154] *** An error occurred in MPI_Irecv
[w-cal-liu:14154] *** reported by process [1716453377,0]
[w-cal-liu:14154] *** on communicator MPI COMMUNICATOR 4 DUP FROM 3
[w-cal-liu:14154] *** MPI_ERR_COUNT: invalid count argument
[w-cal-liu:14154] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[w-cal-liu:14154] *** and potentially your MPI job)
----------------------------------
run command is mpirun -np 1 /home/liu/Projects/CESM_ucar_/CESM2.2.0/projects/test1/bld/cesm.exe >> cesm.log.$LID 2>&1
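One detail that may be relevant (just an observation from the log above, not a confirmed diagnosis): the GLOBAL pelist reports npes = 192, while the run command launches only a single MPI rank (`-np 1`). A mismatch like that could plausibly trigger an error during communicator setup. A sketch of relaunching with a rank count matching the pelist, reusing the exact executable path and redirection from the log:

```shell
# Hypothetical relaunch: match -np to the 192 PEs the GLOBAL pelist expects.
# Path and $LID are taken from the run command in the log above.
mpirun -np 192 /home/liu/Projects/CESM_ucar_/CESM2.2.0/projects/test1/bld/cesm.exe >> cesm.log.$LID 2>&1
```

Normally the CESM case scripts (`case.submit`) generate this command from the case's PE layout, so if `-np 1` was produced automatically it may point to a mis-set NTASKS or a machine-configuration issue rather than something to patch by hand.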