Hello,
I am working with CESM 2.1.3 on the F1850 compset. The script I used to create, set up, and build the case is attached (AllSpecies.csh.txt).
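For context, the script follows the standard CIME sequence; below is a minimal sketch of that workflow (the resolution shown here is a placeholder, not necessarily the value in AllSpecies.csh):

    # Minimal sketch of the standard CIME workflow used in AllSpecies.csh.
    # The resolution (f09_f09_mg17) is a placeholder, not the actual value.
    ./create_newcase --case AllSpecies --compset F1850 --res f09_f09_mg17
    cd AllSpecies
    ./case.setup
    ./case.build
    ./case.submit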
But when I ran case.submit, I encountered an issue I do not know how to solve. It seems related to mpiexec_mpt, as described in this thread: https://bb.cgd.ucar.edu/cesm/threads/cheyenne-update-module-issues-for-all-cesm-versions.4681/. Following that thread, I removed the -p option from env_mach_specific.xml (attached). But even after resetting the case, cleaning the build, and logging out and back in repeatedly, the issue is still there.
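For clarity, this is roughly the change I made and the reset/rebuild sequence I ran afterwards (the exact XML element shown in the comments is my reading of the Cheyenne MPT block and may differ slightly from the attached file):

    # In env_mach_specific.xml I deleted the stdout-labeling argument from the
    # mpiexec_mpt run block (element name below is approximate):
    #   <arg name="labelstdout">-p "%g:"</arg>
    # Then, from the case directory:
    ./case.setup --reset
    ./case.build --clean-all
    ./case.build
    ./case.submit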
CaseStatus:
2020-05-29 08:36:51: case.build success
---------------------------------------------------
2020-05-29 08:40:51: case.submit starting
---------------------------------------------------
2020-05-29 08:40:56: case.submit success case.run:2469378.chadmin1.ib0.cheyenne.ucar.edu, case.st_archive:2469379.chadmin1.ib0.cheyenne.ucar.edu
---------------------------------------------------
2020-05-29 08:40:59: case.run starting
---------------------------------------------------
2020-05-29 08:41:04: model execution starting
---------------------------------------------------
2020-05-29 08:41:07: model execution success
---------------------------------------------------
2020-05-29 08:41:07: case.run error
ERROR: RUN FAIL: Command 'mpiexec_mpt -np 72 omplace -tm open64 /glade/scratch/yujiayou/AllSpecies/bld/cesm.exe >> cesm.log.$LID 2>&1 ' failed
See log file for details: /glade/scratch/myname/AllSpecies/run/cesm.log.2469378.chadmin1.ib0.cheyenne.ucar.edu.200529-084059
The file cesm.log.2469378.chadmin1.ib0.cheyenne.ucar.edu.200529-084059 is attached. The relevant message appears to be: "MPT ERROR: MPI_COMM_WORLD rank 47 has terminated without calling MPI_Finalize() aborting job"
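That MPT message is generic (it only says that one rank aborted), so in case it helps, this is how I have been searching the log for the first substantive error before the abort (the grep pattern is just my guess at what to look for):

    # Search the cesm log for the first real error or traceback before the MPT abort.
    # Run directory and log name are taken from the CaseStatus output above.
    cd /glade/scratch/myname/AllSpecies/run
    grep -n -i -E 'ERROR|abort|Backtrace' cesm.log.2469378.chadmin1.ib0.cheyenne.ucar.edu.200529-084059 | head -20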
Does anyone have any idea how to solve this issue? Any help is deeply appreciated!
Attachments:
- AllSpecies.csh.txt (5.4 KB)
- env_mach_specific.xml.txt (5.1 KB)
- atm.log.2469378.chadmin1.ib0.cheyenne.ucar.edu.200529-084059.txt (71.4 KB)
- cesm.log.2469378.chadmin1.ib0.cheyenne.ucar.edu.200529-084059.txt (134.6 KB)
- cpl.log.2469378.chadmin1.ib0.cheyenne.ucar.edu.200529-084059.txt (41.4 KB)