kunifuda@g_ecc_u-tokyo_ac_jp
New Member
Hello, I've been attempting to port CESM1_2_2 to my laptop (a MacBook Pro). I created this case:

$ ./create_newcase -case ~/cesm_cases/test06 -res f45_g37 -compset X -mach userdefined
$ ./xmlchange -file env_build.xml -id OS -val darwin
$ ./xmlchange -file env_build.xml -id MPILIB -val openmpi
$ ./xmlchange -file env_build.xml -id COMPILER -val gnu
$ ./xmlchange -file env_build.xml -id EXEROOT -val ~/cesm_exe/test06/bld
$ ./xmlchange -file env_build.xml -id CESMSCRATCHROOT -val ~/cesm_exe
$ ./xmlchange -file env_run.xml -id RUNDIR -val ~/cesm_exe/test06/run
$ ./xmlchange -file env_run.xml -id DIN_LOC_ROOT -val ~/cesm_input
$ ./xmlchange -file env_mach_pes.xml -id MAX_TASKS_PER_NODE -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_CPL -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_ROF -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_ICE -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_WAV -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_OCN -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_GLC -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_LND -val 1
$ ./xmlchange -file env_mach_pes.xml -id NTASKS_ATM -val 1

The build seemed to complete successfully, but I received an error message when I tried to run the case. Here is the relevant part of cesm.log:
pio_support::pio_die:: myrank= -1 : ERROR: nf_mod.F90: 1229 : NETCDF not enabled in the build
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 1.

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on exactly when Open MPI kills them.
--------------------------------------------------------------------------
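In case it helps with diagnosis, one way to check whether a netCDF installation is visible from the shell (this assumes the nc-config and nf-config utilities that ship with netCDF and netCDF-Fortran are installed and on the PATH, for example via Homebrew or MacPorts) would be something like:

$ which nc-config          # location of the netCDF C config helper
$ nc-config --version      # netCDF C library version
$ nf-config --flibs        # Fortran link flags for netCDF-Fortran

but I am not sure how to point the CESM build at such an installation so that PIO is built with netCDF support.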
How do I correct this error? Thanks for any help!
Hiroki Kunifuda
The University of Tokyo