
Issues with Porting

morpheus

Morphy Kuffour
New Member
I am trying to create and run a CESM test case on an HPC with the following command:

create_newcase --mach uconnhpc --case testcase0 --compset C --res T62_g17 --compiler gnu

after loading the following modules with this script:
$ cat ~/.cime/loadmodules.sh
#!/usr/bin/env sh

module load netcdf
#module load hdf5
module load mpich
# module unload mpich/4.0.2
module load perl
module load intel
module load blas
module load lapack
module load cmake/3.23.2

Unfortunately, I am receiving the following error after creating the test case and attempting to build:

[attached screenshot: 1675103614757.png]

I do not know where to retrieve the input data, and the documentation recommends not downloading the input data manually. Is there a way to verify that CESM is installed properly and to create and run test cases? This is what I get when I cat the PIO build file:

[attached screenshot: 1675104796106.png]
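On the input data question: my understanding is that CIME can fetch the input data itself, so something like the following, run from the case directory, should report and download whatever is missing (a rough sketch; flags may differ between CIME versions):

cd testcase0
./check_input_data              # list required input files and whether they are present
./check_input_data --download   # attempt to fetch the missing ones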
Any help would be greatly appreciated.
 

jedwards

CSEG and Liaisons
Staff member
CMake is complaining about the path to the compiler. Is that path where your compiler is actually located?
Have you consulted your local system administrators?
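For example, something like this would confirm it (a sketch; substitute the actual path from the CMake message):

which gcc gfortran                  # are these the compilers you expect?
gcc --version                       # the system default gcc may not be the one you loaded
ls -l /path/shown/in/cmake/error    # hypothetical path: does it actually exist?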
 

morpheus

Morphy Kuffour
New Member
I should have prefaced this post by stating that I am using two separate approaches to my port. The first uses nix-user-chroot, a tool that lets me install Nix packages in a namespace in which the test cases can be built; the second is natively installing all the required packages on the HPC in order to compile and build the case. In any case, I am now running into the following error:
[attached screenshot: 1675108456472.png]
 

jedwards

CSEG and Liaisons
Staff member
This is not the same compset you were using above, but the error is due to using a CAM configuration
that is not supported out of the box, so no initial condition file is provided.
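You can list the compsets that are supported out of the box with query_config, e.g. from the CIME scripts directory (sandbox path assumed from your setup):

cd ~/my_cesm_sandbox/cime/scripts
./query_config --compsets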
 

morpheus

Morphy Kuffour
New Member
I appreciate all your time and your help. I am now running into the following error, after properly loading the modules and specifying the locations of NETCDF, PNETCDF, MPICC, and MPIFC in my config_compilers.xml.

The following is my config_compilers.xml:

<compiler MACH="uconnhpc" COMPILER="gnu">
  <CONFIG_ARGS>
    <base> --host=Linux </base>
  </CONFIG_ARGS>
  <NETCDF_PATH>/cm/shared/apps/netcdf/gcc/64/4.8.1</NETCDF_PATH>
  <PNETCDF_PATH>/cm/shared/apps/netcdf/gcc/64/4.8.1</PNETCDF_PATH>
  <MPICC>/cm/shared/apps/mpich/ge/gcc/64/3.4.2/bin/mpicc</MPICC>
  <MPIFC>/cm/shared/apps/mpich/ge/gcc/64/3.4.2/bin/mpif90</MPIFC>
  <SFC>gfortran</SFC>
  <SCC>gcc</SCC>
  <SCXX>g++</SCXX>
  <FFLAGS>
    <append> -fno-range-check </append>
  </FFLAGS>
  <SLIBS>
    <append>$(shell $(NETCDF_PATH)/bin/nc-config --flibs)</append>
  </SLIBS>
</compiler>
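As a sanity check of the paths above, nc-config can be queried directly (paths copied from the config; adjust if your install differs):

/cm/shared/apps/netcdf/gcc/64/4.8.1/bin/nc-config --version   # confirms the library location
/cm/shared/apps/netcdf/gcc/64/4.8.1/bin/nc-config --flibs     # the Fortran link flags used in SLIBS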

I run the following script in order to load the necessary modules:
#!/usr/bin/env sh

module load netcdf
module load mpich
module load perl
module load intel
module load blas
module load lapack
module load cmake/3.23.2
module load openmpi/4.1.4

I then run
create_newcase --case ~/cases/case01 --compset B1850 --res f09_g16 --mach uconnhpc --run-unsupported
in order to create the case.
I then run ./case.build --clean in ~/cases/case01 as a sanity check.
I then run ./case.build, but I am running into the following PIO build error:
[attached screenshot: 1675131260415.png]

I followed this GitHub issue, ERROR: CIME models require NETCDF in PIO build · Issue #2936 · ESMCI/cime, in hopes of figuring out the error, but I am still stuck on the PIO error.
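In case it helps anyone else, this is roughly how the full PIO log can be located (a sketch, assuming a standard CIME case directory):

cd ~/cases/case01
./xmlquery EXEROOT                               # the case's build root
ls $(./xmlquery --value EXEROOT)/pio.bldlog.*    # the full PIO build log(s)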
 

jedwards

CSEG and Liaisons
Staff member
I can't tell what the error is; it may be better to attach the pio.bldlog as a txt file (just add .txt to the end of the file name).
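For example (hypothetical timestamp; use your actual log name):

cp pio.bldlog.230130-214418 pio.bldlog.230130-214418.txt   # the forum accepts .txt attachments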
 

morpheus

Morphy Kuffour
New Member
I have attached the pio.bldlog.txt file below.
 

Attachments

  • pio.bldlog.230130-214418.txt
    3.4 KB · Views: 6

jedwards

CSEG and Liaisons
Staff member
The modules need to be listed in config_machines.xml and are loaded from there; I think your script is being ignored.
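One way to confirm what CIME will actually load: once the case is set up, the module commands from config_machines.xml are written into .env_mach_specific.sh in the case directory (a sketch, assuming a standard case layout):

cd ~/cases/case01
cat .env_mach_specific.sh                      # module loads generated from config_machines.xml
source .env_mach_specific.sh && module list    # the environment the build will really see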
 

morpheus

Morphy Kuffour
New Member
I was able to load the modules properly. I am now running into the following error when I try to do a ./case.build after creating a new case:

mok18003@login6:~/cases/case01 $ ./case.build
Building case in directory /home/mok18003/cases/case01
sharedlib_only is False
model_only is False
Generating component namelists as part of build
Creating component namelists
Calling /home/mok18003/my_cesm_sandbox/components/cam//cime_config/buildnml
...calling cam buildcpp to set build time options
CAM namelist copy: file1 /home/mok18003/cases/case01/Buildconf/camconf/atm_in file2 /gpfs/homefs1/mok18003/scratch/case01/run/atm_in
Calling /home/mok18003/my_cesm_sandbox/components/clm//cime_config/buildnml
Calling /home/mok18003/my_cesm_sandbox/components/cice//cime_config/buildnml
...calling cice buildcpp to set build time options
File not found: grid_file = "/gpfs/homefs1/mok18003/scratch/inputdata/ocn/pop/gx1v6/grid/horiz_grid_20010402.ieeer8", will attempt to download in check_input_data phase
File not found: kmt_file = "/gpfs/homefs1/mok18003/scratch/inputdata/ocn/pop/gx1v6/grid/topography_20090204.ieeei4", will attempt to download in check_input_data phase
File not found: ice_ic = "/gpfs/homefs1/mok18003/scratch/inputdata/ice/cice/b.e15.B1850G.f09_g16.pi_control.25.cice.r.0041-01-01-00000.nc", will attempt to download in check_input_data phase
Calling /home/mok18003/my_cesm_sandbox/components/pop//cime_config/buildnml
...calling pop buildcpp to set build time options
Traceback (most recent call last):
  File "/home/mok18003/cases/case01/./case.build", line 147, in <module>
    _main_func(__doc__)
  File "/home/mok18003/cases/case01/./case.build", line 140, in _main_func
    success = build.case_build(caseroot, case=case, sharedlib_only=sharedlib_only,
  File "/home/mok18003/my_cesm_sandbox/cime/scripts/Tools/../../scripts/lib/CIME/build.py", line 570, in case_build
    return run_and_log_case_status(functor, "case.build", caseroot=caseroot)
  File "/home/mok18003/my_cesm_sandbox/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 1634, in run_and_log_case_status
    rv = func()
  File "/home/mok18003/my_cesm_sandbox/cime/scripts/Tools/../../scripts/lib/CIME/build.py", line 568, in <lambda>
    functor = lambda: _case_build_impl(caseroot, case, sharedlib_only, model_only, buildlist,
  File "/home/mok18003/my_cesm_sandbox/cime/scripts/Tools/../../scripts/lib/CIME/build.py", line 511, in _case_build_impl
    sharedpath = _build_checks(case, build_threaded, comp_interface,
  File "/home/mok18003/my_cesm_sandbox/cime/scripts/Tools/../../scripts/lib/CIME/build.py", line 204, in _build_checks
    case.create_namelists()
  File "/home/mok18003/my_cesm_sandbox/cime/scripts/Tools/../../scripts/lib/CIME/case/preview_namelists.py", line 88, in create_namelists
    run_sub_or_cmd(cmd, (caseroot), "buildnml", (self, caseroot, compname), case=self)
  File "/home/mok18003/my_cesm_sandbox/cime/scripts/Tools/../../scripts/lib/CIME/utils.py", line 345, in run_sub_or_cmd
    getattr(mod, subname)(*subargs)
  File "/home/mok18003/my_cesm_sandbox/components/pop//cime_config/buildnml", line 89, in buildnml
    mod.buildcpp(case)
  File "/home/mok18003/my_cesm_sandbox/components/pop/cime_config/buildcpp", line 241, in buildcpp
    pop_cppdefs = determine_tracer_count(case, caseroot, srcroot, decomp_cppdefs)
  File "/home/mok18003/my_cesm_sandbox/components/pop/cime_config/buildcpp", line 117, in determine_tracer_count
    from MARBL_wrappers import MARBL_settings_for_POP
  File "/home/mok18003/my_cesm_sandbox/components/pop/MARBL_scripts/MARBL_wrappers/__init__.py", line 1, in <module>
    from MARBL_settings import MARBL_settings_for_POP
ModuleNotFoundError: No module named 'MARBL_settings'

Do I have to install any additional packages using pip? Any help would be greatly appreciated.
 

jedwards

CSEG and Liaisons
Staff member
It looks like you don't have a complete checkout. From the top-level directory, try
./manage_externals/checkout_externals again. Do you get any errors or warnings?
 

morpheus

Morphy Kuffour
New Member
Yes, I do get an error when I run
./manage_externals/checkout_externals -S

I do not get a status for each of the externals; instead, I get the following error:

Checking status of externals: cam, dictionary keys changed during iteration
[attached screenshot: 1677528736433.png]
 

morpheus

Morphy Kuffour
New Member
I was able to run ./manage_externals/checkout_externals from my CESM clone with no errors, but I am getting the same error when I run ./case.build in my test case.

mok18003@login6:~/my_cesm_sandbox $ ./manage_externals/checkout_externals
Processing externals description file : Externals.cfg
Processing externals description file : Externals_CISM.cfg
Processing externals description file : Externals_CLM.cfg
Processing externals description file : Externals_POP.cfg
Checking status of externals: cam, cice, cime, cism, source_cism, clm, fates, ptclm, mosart, pop, cvmix, marbl, rtm, ww3,
Checking out externals: cam, cice, cime, cism, clm, mosart, pop, rtm, ww3,
Processing externals description file : Externals_CISM.cfg
Checking out externals: source_cism,
Processing externals description file : Externals_CLM.cfg
Checking out externals: fates, ptclm,
Processing externals description file : Externals_POP.cfg
Checking out externals: cvmix, marbl,

mok18003@login6:~/my_cesm_sandbox $ ./manage_externals/checkout_externals -S
Processing externals description file : Externals.cfg
Processing externals description file : Externals_CISM.cfg
Processing externals description file : Externals_CLM.cfg
Processing externals description file : Externals_POP.cfg
Checking status of externals: cam, cice, cime, cism, source_cism, clm, fates, ptclm, mosart, pop, cvmix, marbl, rtm, ww3,
./cime
./components/cam
./components/cice
./components/cism
./components/cism/source_cism
./components/clm
./components/clm/src/fates
./components/clm/tools/PTCLM
./components/mosart
./components/pop
./components/pop/externals/CVMix
./components/pop/externals/MARBL
./components/rtm
./components/ww3
 

morpheus

Morphy Kuffour
New Member
Thank you for all your help @jedwards

I made significant progress in the ./case.build process. I am now running into the following error:

Error: Symbol 'gen_hash_key_offset' at (1) has no IMPLICIT type; did you mean 'gen_hashkey'?

which was also reported here but never answered. The build is taking longer than usual, which I think means it is actually building the case. Is there perhaps a known, supported test case that builds? This is the command I used to create the new case:

create_newcase --case ~/cases/case01 --compset B1850 --res f09_g16 --mach uconnhpc --run-unsupported

I have also attached the build log. This is a link to my config_machines.xml and other files defined in my ~/.cime folder.
 

Attachments

  • lnd.bldlog.230227-215020.txt
    190.4 KB · Views: 1

jedwards

CSEG and Liaisons
Staff member
I'm confused by your config_machines: you are specifying a gnu compiler in <COMPILERS>
but then loading modules for intel. I think you are getting the system default gnu compiler, which is too old to compile this code.
I recommend you change it to <COMPILERS>intel</COMPILERS> and try again.
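Concretely, that means setting <COMPILERS>intel</COMPILERS> in your config_machines.xml entry, or recreating the case with the compiler flag, e.g. (hypothetical new case name):

create_newcase --case ~/cases/case02 --compset B1850 --res f09_g16 --mach uconnhpc --compiler intel --run-unsupported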
 

morpheus

Morphy Kuffour
New Member
I have been at this stage of the ./case.build process for a while. I did try the intel compiler specification in the XML file, but that led to the compiler flag error below. Are there specific versions of netcdf-fortran that are known to work? I appreciate all of your help.

make -f /home/mok18003/my_cesm_sandbox/cime/src/share/timing/Makefile install -C /gpfs/homefs1/mok18003/scratch/case01/bld/intel/openmpi/nodebug/nothreads/mct/gptl MACFILE=/home/mok18003/cases/case01/Macros.make MODEL=gptl COMP_NAME=gptl GPTL_DIR=/home/mok18003/my_cesm_sandbox/cime/src/share/timing GPTL_LIBDIR=/gpfs/homefs1/mok18003/scratch/case01/bld/intel/openmpi/nodebug/nothreads/mct/gptl SHAREDPATH=/gpfs/homefs1/mok18003/scratch/case01/bld/intel/openmpi/nodebug/nothreads/mct CIME_MODEL=cesm SMP=FALSE CASEROOT="/home/mok18003/cases/case01" CASETOOLS="/home/mok18003/cases/case01/Tools" CIMEROOT="/home/mok18003/my_cesm_sandbox/cime" COMP_INTERFACE="mct" COMPILER="intel" DEBUG="FALSE" EXEROOT="/gpfs/homefs1/mok18003/scratch/case01/bld" INCROOT="/gpfs/homefs1/mok18003/scratch/case01/bld/lib/include" LIBROOT="/gpfs/homefs1/mok18003/scratch/case01/bld/lib" MACH="uconnhpc" MPILIB="openmpi" NINST_VALUE="c1a1l1i1o1r1g1w1i1e1" OS="LINUX" PIO_VERSION="1" SHAREDLIBROOT="/gpfs/homefs1/mok18003/scratch/case01/bld" SMP_PRESENT="FALSE" USE_ESMF_LIB="FALSE" USE_MOAB="FALSE" CAM_CONFIG_OPTS="-phys cam6 -co2_cycle" COMP_LND="clm" COMPARE_TO_NUOPC="FALSE" CISM_USE_TRILINOS="FALSE" USE_TRILINOS="FALSE" USE_ALBANY="FALSE" USE_PETSC="FALSE"
make: Entering directory '/gpfs/homefs1/mok18003/scratch/case01/bld/intel/openmpi/nodebug/nothreads/mct/gptl'
mpicc -c -I/home/mok18003/my_cesm_sandbox/cime/src/share/timing -qno-opt-dynamic-align -fp-model precise -std=gnu99 -O2 -debug minimal -DCESMCOUPLED -DFORTRANUNDERSCORE -DCPRINTEL -DMCT_INTERFACE -DHAVE_MPI /home/mok18003/my_cesm_sandbox/cime/src/share/timing/gptl.c
make: Leaving directory '/gpfs/homefs1/mok18003/scratch/case01/bld/intel/openmpi/nodebug/nothreads/mct/gptl'
gcc: error: unrecognized command-line option '-qno-opt-dynamic-align'
gcc: error: unrecognized command-line option '-fp-model'; did you mean '-fipa-modref'?
make: *** [/home/mok18003/my_cesm_sandbox/cime/src/share/timing/Makefile:81: gptl.o] Error 1
ERROR: gcc: error: unrecognized command-line option '-qno-opt-dynamic-align'
gcc: error: unrecognized command-line option '-fp-model'; did you mean '-fipa-modref'?
 

jedwards

CSEG and Liaisons
Staff member
You say that you are using the intel compiler, but this error is a gnu compiler error. You need to make sure that the
intel compiler is found in your path before the gnu compiler.
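A quick way to check (a sketch; -show is the MPICH spelling and --showme the Open MPI one):

which icc ifort gcc    # which compilers are first in PATH?
mpicc -show            # MPICH: print the underlying compiler command
mpicc --showme         # Open MPI equivalent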
 