
CLM 3.5 Compile Issues

bmr205@psu_edu

New Member
Hello,
Having issues with the compile:

On the Penn State High-Performance Computing clusters, such as lion-xc
and lion-xo, the include files for mpi and for netcdf are located in
/usr/global/mpich/include and /usr/global/netcdf/include. The parallel
Fortran 90 and C compilers are mpif90 and mpicc, which are located in
/usr/bin/.
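
As a quick sanity check (just a sketch, not captured output), commands along
these lines, run on the node doing the build, show whether the header and the
wrappers are actually visible there:

ls -l /usr/global/mpich/include/mpif.h
ls /usr/global/netcdf/include
which mpif90 mpicc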

I put this information on the appropriate lines in run-pc.csh (not included)
and submitted it with the command

env OPT_BLD=PGI qsub run-pc.csh

The result is that sometimes the file /usr/global/mpich/include/mpif.h
can be found, and other times not. This is documented in the MAKE.out
file, which I have included below. It looks like the configure scripts
make a link to mpif.h where it is needed for the main source directory,
but for some subdirectories this is not happening. Even when I make a
symbolic link to mpif.h by hand in the directory where it is needed,
there are other header files within the CLM3.5 directory tree that are
not found.

I don't know what environment variable to define that will cause mpif.h
to be located by all pieces of CLM3.5, and I don't know why the compiler
flag "-I/usr/global/mpich/include" is being ignored.

See portions of the MAKE.out file below:

Thank You,

Brett Raczka


cat: Srcfiles: No such file or directory
Makefile:822: /home3/bct111/CLM3.5/test/clmrun/bld/Depends: No such file or directory
/home3/bct111/CLM3.5/clm3.5/bld/mkSrcfiles > /home3/bct111/CLM3.5/test/clmrun/bld/Srcfiles
/home3/bct111/CLM3.5/clm3.5/bld/mkDepends Filepath Srcfiles > /home3/bct111/CLM3.5/test/clmrun/bld/Depends
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/csm_share/shr/shr_kind_mod.F90
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/utils/esmf_wrf_timemgr/ESMF_FractionMod.F90
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/utils/esmf_wrf_timemgr/ESMF_BaseMod.F90
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/utils/mct/mpeu/m_stdio.F90
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/utils/mct/mpeu/m_chars.F90
ln: `mpif.h': File exists
ln: `mpif.h': File exists
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/utils/mct/mpeu/m_mpif.F90
ln: `mpif.h': File exists
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/utils/mct/mpeu/m_realkinds.F90
ln: `mpif.h': File exists
ln: `mpif.h': File exists
ln: `mpif.h': File exists
mpicc -c -I. -I/home3/bct111/CLM3.5/clm3.5/bld/usr.src -I/home3/bct111/CLM3.5/clm3.5/src/csm_share/shr -I/home3/bct111/CLM3.5/clm3.5/src/csm_share/eshr -I/home3/bct111/CLM3.5/clm3.5/src/utils/timing -I/home3/bct111/CLM3.5/clm3.5/src/utils/mct/mpeu -I/home3/bct111/CLM3.5/clm3.5/src/utils/mct/mct -I/home3/bct111/CLM3.5/clm3.5/src/utils/esmf_wrf_timemgr -I/home3/bct111/CLM3.5/clm3.5/src/main -I/home3/bct111/CLM3.5/clm3.5/src/biogeophys -I/home3/bct111/CLM3.5/clm3.5/src/biogeochem -I/home3/bct111/CLM3.5/clm3.5/src/riverroute -I/usr/global/netcdf/include -I/usr/global/mpich/include -DFORTRANUNDERSCORE -DMAXPATCH_PFT=4 -DOFFLINE -DSPMD -DLINUX /home3/bct111/CLM3.5/clm3.5/src/utils/timing/f_wrappers.c
fortcom: Warning: ESMF_BaseMod.F90, line 672: A dummy argument with an explicit INTENT(OUT) declaration is not given an explicit value. [TYPE]
subroutine ESMF_AttributeGetbyNumber(anytype, number, name, type, value, rc)
------------------------------------------------------------------^

fortcom: Error: m_mpif.F90, line 60: Cannot open include file 'mpif.h'
include "mpif.h"
------------------^
fortcom: Error: m_mpif.F90, line 35: This name does not have a type, and must have an explicit type. [MPI_INTEGER]
public :: MPI_INTEGER
--------------------^
fortcom: Error: m_mpif.F90, line 36: This name does not have a type, and must have an explicit type. [MPI_REAL]
public :: MPI_REAL
--------------------^
fortcom: Error: m_mpif.F90, line 37: This name does not have a type, and must have an explicit type. [MPI_DOUBLE_PRECISION]
public :: MPI_DOUBLE_PRECISION
--------------------^
fortcom: Error: m_mpif.F90, line 38: This name does not have a type, and must have an explicit type. [MPI_LOGICAL]
public :: MPI_LOGICAL
--------------------^
fortcom: Error: m_mpif.F90, line 39: This name does not have a type, and must have an explicit type. [MPI_CHARACTER]
public :: MPI_CHARACTER
--------------------^
fortcom: Error: m_mpif.F90, line 41: This name does not have a type, and must have an explicit type. [MPI_REAL4]
public :: MPI_REAL4
--------------------^
fortcom: Error: m_mpif.F90, line 42: This name does not have a type, and must have an explicit type. [MPI_REAL8]
public :: MPI_REAL8
--------------------^
fortcom: Error: m_mpif.F90, line 44: This name does not have a type, and must have an explicit type. [MPI_COMM_WORLD]
public :: MPI_COMM_WORLD
--------------------^
fortcom: Error: m_mpif.F90, line 45: This name does not have a type, and must have an explicit type. [MPI_COMM_NULL]
public :: MPI_COMM_NULL
--------------------^
fortcom: Error: m_mpif.F90, line 47: This name does not have a type, and must have an explicit type. [MPI_SUM]
public :: MPI_SUM
--------------------^
fortcom: Error: m_mpif.F90, line 48: This name does not have a type, and must have an explicit type. [MPI_PROD]
public :: MPI_PROD
--------------------^
fortcom: Error: m_mpif.F90, line 49: This name does not have a type, and must have an explicit type. [MPI_MIN]
public :: MPI_MIN
--------------------^
fortcom: Error: m_mpif.F90, line 50: This name does not have a type, and must have an explicit type. [MPI_MAX]
public :: MPI_MAX
--------------------^
fortcom: Error: m_mpif.F90, line 52: This name does not have a type, and must have an explicit type. [MPI_MAX_ERROR_STRING]
public :: MPI_MAX_ERROR_STRING
--------------------^
fortcom: Error: m_mpif.F90, line 53: This name does not have a type, and must have an explicit type. [MPI_STATUS_SIZE]
public :: MPI_STATUS_SIZE
--------------------^
fortcom: Error: m_mpif.F90, line 54: This name does not have a type, and must have an explicit type. [MPI_ANY_SOURCE]
public :: MPI_ANY_SOURCE
--------------------^
mpif90 -c /home3/bct111/CLM3.5/clm3.5/src/main/getdatetime.F90
compilation aborted for /home3/bct111/CLM3.5/clm3.5/src/utils/mct/mpeu/m_mpif.F90 (code 1)
gmake: *** [m_mpif.o] Error 1
gmake: *** Waiting for unfinished jobs....
 

erik

Erik Kluzek
CSEG and Liaisons
Staff member
Hi Brett

Since it appears that the compiler can sometimes find the "mpif.h" file and sometimes not, this sounds like a system problem. The main suggestion is to talk to your systems folks and see what might be up with the machine you are working on. With cluster machines it's not uncommon for some nodes/processors to periodically "lose" the connection to a disk.

The other suggestion is to do the build interactively, without submitting it to the queue, because sometimes batch nodes behave differently than interactive nodes. So just comment out the part where the script runs clm, and run the configure/build part of the script by hand interactively by typing the script name in:

./run-pc.csh

You might also cut back the parallelism in the make by using "gmake" instead of "gmake -j 2".
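
Roughly, the interactive test would look like this (just a sketch; the exact
lines to comment out depend on your copy of run-pc.csh, and I'm assuming you
still want the same OPT_BLD setting you used with qsub):

# in run-pc.csh: comment out the lines that actually execute clm,
# and change "gmake -j 2" to plain "gmake" if you want a serial build
env OPT_BLD=PGI ./run-pc.csh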

good luck

Erik
 