
Can I get some guidance on which version to port?

cdevaneprugh

Cooper
Member
Hello, my research group is looking to port CTSM or CLM. I have successfully ported CESM 2.1.5 to our HPC, but it does not give us all the functionality we are looking for. We need the ability to do spin-ups, as well as restarts, in single-point mode. Based on the demo found here, we need access to the tools for creating input datasets found in $CTSM/tools. Based on the README in that directory, it looks like any version after ctsm5.1.dev158 should be fine. So would the latest ctsm5.3.012 tag work for our purposes?

Any advice is appreciated. I should be able to make any version work with our HPC. It's just a matter of narrowing down the version I need to focus on so I can work with our HPC system administrators.
 

oleson

Keith Oleson
CSEG and Liaisons
Staff member
Yes, I think ctsm5.3.012 would work for your purposes. You would be able to do single-point, regional, and global land-only simulations with it. You can create surface datasets with the mksurfdata_esmf tool, as well as subset the global data for single-point and regional simulations using the subset_data tool.
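As a sketch, a single-point subset with the subset_data tool typically looks something like the following. The flag names and values here are illustrative assumptions, not a confirmed recipe; run ./subset_data --help in your checkout to confirm the exact options for your tag.

```shell
# From the CTSM checkout (paths, coordinates, and flags are illustrative)
cd $CTSM/tools/site_and_regional

# Subset the global surface data down to a single point
./subset_data point \
    --lat 42.5 --lon 287.8 \
    --site my_single_point \
    --create-surface \
    --outdir /path/to/subset_output
```

The same script has a region mode for regional domains; the mksurfdata_esmf tool is used instead when you need to generate a new surface dataset from the raw input data rather than subset an existing one.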
 

xiaoxiaokuishu

Ru Xu
Member
Yes, I think ctsm5.3.012 would work for your purposes. You would be able to do single-point, regional, and global land-only simulations with it. You can create surface datasets with the mksurfdata_esmf tool, as well as subset the global data for single-point and regional simulations using the subset_data tool.
Hi Oleson,

I have a question about the ./subset_data tool:
(1) Is it only for CTSM? I did not find it under clm5.0.
(2) I created surface (and domain?) files with the CTSM5.3 ./subset_data tool and tried to use them under clm5.0. I think the problem is that subset_data does not create a domain file, but CLM5.0 requires one.

Do you have any suggestions for creating a domain file easily under CLM5.0? The tools under clm5.0 do not seem as easy to use as ./subset_data, but ./subset_data does not create the domain file that CLM5.0 requires.

Best,
Ru
 

cdevaneprugh

Cooper
Member
I've got a follow-up question regarding building some of the CTSM tools. In $CTSM/tools/mksurfdata_esmf I'm trying to run ./gen_mksurfdata_build but get this error:

Code:
The PIO directory for the PIO build is required and was not set in the configure
Make sure a PIO build is provided for gnu with openmpi in config_machines

I built the PIO library in $CTSM/libraries/parallelio and tried setting a PIO_LIBDIR environment variable in config_machines as well as the cmake file for our computer but have had no luck. I was going to start messing with the config_pio.xml in ccs_config/machines but figure I would ask here first. How should I proceed?
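For reference, a standalone PIO build in $CTSM/libraries/parallelio is usually configured along these lines. The variable names and paths below are assumptions about a typical CMake setup, not the definitive procedure for this machine; check the ParallelIO build documentation for the options your version supports.

```shell
# Illustrative standalone PIO build (paths and CMake options are assumptions)
cd $CTSM/libraries/parallelio
mkdir -p build && cd build

# Point PIO's CMake at the MPI compilers and NetCDF installs on your system
CC=mpicc FC=mpif90 cmake \
    -DNetCDF_C_PATH=/path/to/netcdf-c \
    -DNetCDF_Fortran_PATH=/path/to/netcdf-fortran \
    -DCMAKE_INSTALL_PREFIX=$HOME/pio \
    ..
make -j4 install
```

The install prefix chosen here is what an environment variable like PIO (or PIO_LIBDIR/PIO_INCDIR) would then need to point at so the mksurfdata_esmf configure step can find the library.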
 

slevis

Moderator
Staff member
@xiaoxiaokuishu the subset_data tool was developed long after clm5, so its outputs may not be compatible with clm5. If from your experience that is the case, then you will need to use clm5-specific tools to accomplish your tasks.

@cdevaneprugh this seems like a porting issue, so I recommend asking for help in the "porting" forum.
 

Lumoss

Member
cdevaneprugh said: I've got a follow up question regarding building some of the CTSM tools. [...] How should I proceed?
I have encountered the same problem as you. Please let me know if you have a solution. Thank you!
 

Lumoss

Member
I have encountered the same problem as you. Please let me know if you have a solution. Thank you!
add:

Code:
if(DEFINED ENV{PIO})
  set(PIO_LIBDIR "$ENV{PIO}/lib")
  set(PIO_INCDIR "$ENV{PIO}/include")
endif()

to cmake_macros/*.cmake

and add <env name="PIO">your path</env> to your config_machines.xml
 

cdevaneprugh

Cooper
Member
add:
if(DEFINED ENV{PIO})
set(PIO_LIBDIR "$ENV{PIO}/lib")
set(PIO_INCDIR "$ENV{PIO}/include")
endif()
to cmake_macros/*.cmake

and add <env name="PIO">your path</env> to your config_machines.xml

This definitely made some progress. When I run the script it looks like it detects the libraries it needs.

Code:
-- The Fortran compiler identification is GNU 12.2.0
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - done
-- Check for working Fortran compiler: /apps/mpi/gcc/12.2.0/openmpi/4.1.6/bin/mpif90 - skipped
-- Found MPI_Fortran: /apps/mpi/gcc/12.2.0/openmpi/4.1.6/bin/mpif90 (found version "3.1")
-- Found MPI: TRUE (found version "3.1") 
-- Found NetCDF: /apps/gcc/12.2.0/openmpi/4.1.6/netcdf-c/4.9.2/include;/apps/gcc/12.2.0/openmpi/4.1.6/netcdf-f/4.6.1/include (found suitable version "4.9.2", minimum required is "4.7.4") found components: Fortran
-- FindNetCDF defines targets:
--   - NetCDF_VERSION [4.9.2]
--   - NetCDF_PARALLEL [TRUE]
--   - NetCDF_C_CONFIG_EXECUTABLE [/apps/gcc/12.2.0/openmpi/4.1.6/netcdf-c/4.9.2/bin/nc-config]
--   - NetCDF::NetCDF_C [SHARED] [Root: /apps/gcc/12.2.0/openmpi/4.1.6/netcdf-c/4.9.2] Lib: /apps/gcc/12.2.0/openmpi/4.1.6/netcdf-c/4.9.2/lib64/libnetcdf.so
--   - NetCDF_Fortran_CONFIG_EXECUTABLE [/apps/gcc/12.2.0/openmpi/4.1.6/netcdf-f/4.6.1/bin/nf-config]
--   - NetCDF::NetCDF_Fortran [SHARED] [Root: /apps/gcc/12.2.0/openmpi/4.1.6/netcdf-f/4.6.1] Lib: /apps/gcc/12.2.0/openmpi/4.1.6/netcdf-f/4.6.1/lib/libnetcdff.so
-- Found ESMF library: /apps/gcc/12.2.0/openmpi/4.1.6/esmf/8.7.0/lib/libO/Linux.gfortran.64.openmpi.default/libesmf.a
-- Found ESMF: /apps/gcc/12.2.0/openmpi/4.1.6/esmf/8.7.0/lib/libO/Linux.gfortran.64.openmpi.default (found suitable version "8.7.0", minimum required is "8.2.0") 
-- Configuring done (2.8s)
-- Generating done (0.0s)
-- Build files have been written to: /blue/gerber/earth_models/ctsm5.3/tools/mksurfdata_esmf/tool_bld

But then it also spits out a bunch of errors like this at me. Not sure if there is a specific way I need to build the PIO library, or if this is something else.

Code:
[  2%] Building Fortran object CMakeFiles/mksurfdata.dir/shr_kind_mod.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[  5%] Building Fortran object CMakeFiles/mksurfdata.dir/mkchecksMod.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[  7%] Building Fortran object CMakeFiles/mksurfdata.dir/shr_sys_mod.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[ 10%] Building Fortran object CMakeFiles/mksurfdata.dir/mkutilsMod.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[ 13%] Building Fortran object CMakeFiles/mksurfdata.dir/shr_const_mod.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[ 15%] Building Fortran object CMakeFiles/mksurfdata.dir/mkvarpar.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[ 18%] Building Fortran object CMakeFiles/mksurfdata.dir/mkesmfMod.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[ 21%] Building Fortran object CMakeFiles/mksurfdata.dir/mkvarctl.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
[ 23%] Building Fortran object CMakeFiles/mksurfdata.dir/mkdiagnosticsMod.F90.o
f951: Warning: Nonexistent include directory ‘/include’ [-Wmissing-include-dirs]
/blue/gerber/earth_models/ctsm5.3/tools/mksurfdata_esmf/src/mkdiagnosticsMod.F90:238:20:

  238 |     call mpi_reduce(loc_gdata_i, gdata_i, 1, MPI_REAL8, MPI_SUM, 0, mpicom, ier)
      |                    1
......
  420 |     call mpi_reduce(loc_garea_i, garea_i, size(garea_i), MPI_REAL8, MPI_SUM, 0, mpicom, ier)
      |                    2
Error: Rank mismatch between actual argument at (1) and actual argument at (2) (rank-1 and scalar)

make[2]: *** [CMakeFiles/mksurfdata.dir/build.make:101: CMakeFiles/mksurfdata.dir/mkdiagnosticsMod.F90.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:83: CMakeFiles/mksurfdata.dir/all] Error 2
make: *** [Makefile:136: all] Error 2
Error doing make for hipergator openmpi gnu

@Lumoss Were you able to build everything successfully?
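Two observations on the log above, offered as likely explanations rather than confirmed diagnoses: the "Nonexistent include directory '/include'" warnings suggest an include-path variable (such as an unset PIO_INCDIR) is expanding to an empty prefix, and the rank-mismatch failure is characteristic of gfortran 10 and newer, which turn argument-mismatch warnings in old-style MPI calls into hard errors. A common workaround for the latter is to relax that check for this build:

```shell
# Illustrative workaround for gfortran >= 10 rank-mismatch errors in MPI calls.
# Exactly where the flag belongs depends on your ccs_config setup; one option
# is appending it to FFLAGS in your machine's gnu cmake macros file, e.g.:
#   string(APPEND FFLAGS " -fallow-argument-mismatch")
# Alternatively, export it before rerunning the build script:
export FFLAGS="$FFLAGS -fallow-argument-mismatch"
./gen_mksurfdata_build
```

This only downgrades the check back to a warning; whether the CTSM developers intend a source-level fix for mkdiagnosticsMod.F90 is a separate question for the porting forum.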
 

Lumoss

Member
cdevaneprugh said: This definitely made some progress. [...] @Lumoss Were you able to build everything successfully?
The version I'm using is ctsm5.2. After the modification, I was able to build mksurfdata successfully on my machine without encountering your problem.
Bash:
[100%] Linking Fortran executable mksurfdata
[100%] Built target mksurfdata

Successfully created mksurfdata_esmf executable for: Lumos_intel-oneapi for impi library
 