
ccsm.exe error despite LD_LIBRARY_PATH

ruk15@psu_edu

New Member
Hi,

I am getting the following error, even though I set LD_LIBRARY_PATH in my .bashrc file:

export LD_LIBRARY_PATH=/gscratch/vsm/ravi/local/lib:$LD_LIBRARY_PATH

I also have this in my env_mach_specific:

setenv NETCDF_PATH /gscratch/vsm/ravi/local

I even checked that LD_LIBRARY_PATH is set at the prompt:

echo $LD_LIBRARY_PATH
/gscratch/vsm/ravi/local/lib::/sw/openmpi/lib:/gscratch/vsm/ravi/libcurl

But I still get the following error during execution. The code runs for about 25 seconds and then the job terminates:

./ccsm.exe: error while loading shared libraries: libnetcdff.so.5: cannot open shared object file: No such file or directory

Thanks,
Ravi
Pennsylvania State University
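For this class of failure, `ldd` is the usual diagnostic: it lists every shared library a binary needs and marks unresolved ones "not found". A minimal sketch (the ccsm.exe name and /gscratch path are taken from the post above; run the ldd line on the actual binary):

```shell
# On the failing binary, list the libraries the dynamic linker cannot find:
#   ldd ./ccsm.exe | grep 'not found'
# Then confirm the directory holding libnetcdff.so.5 is really one of the
# colon-separated entries on the runtime search path:
export LD_LIBRARY_PATH=/gscratch/vsm/ravi/local/lib:$LD_LIBRARY_PATH
echo "$LD_LIBRARY_PATH" | tr ':' '\n' | grep '^/gscratch/vsm/ravi/local/lib$'
```

Note that LD_LIBRARY_PATH must be set in the environment of the process that actually launches ccsm.exe (e.g. the batch job), not just in the login shell.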
 

jedwards

CSEG and Liaisons
Staff member
Try putting the LD_LIBRARY_PATH setting into the env_mach_specific file. You can also pass the -Wl,-rpath flag to the linker by adding it to the Macros file. Make sure that libnetcdff.so.5 is in the specified directory.
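For reference, the rpath route embeds the library directory in the executable at link time, so LD_LIBRARY_PATH is no longer needed at run time. A sketch of a Macros addition (the SLIBS variable name and the /gscratch path are assumptions based on this thread; the exact variable name differs between CESM versions):

```makefile
# Macros fragment (sketch): -L locates libnetcdff at link time, and
# -Wl,-rpath records the same directory inside the binary for run time.
SLIBS := $(SLIBS) -L/gscratch/vsm/ravi/local/lib -Wl,-rpath,/gscratch/vsm/ravi/local/lib -lnetcdff -lnetcdf
```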
 

ruk15@psu_edu

New Member
Thank you. I did that, and the build failed with the following error:

CCSM PRESTAGE SCRIPT HAS FINISHED SUCCESSFULLY
-------------------------------------------------------------------------
 CCSM BUILDEXE SCRIPT STARTING
 - Build Libraries: mct pio csm_share
Wed Jan 22 07:08:36 PST 2014 /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/mct.bldlog.140122-070810
ERROR: mct.buildlib failed, see /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/mct.bldlog.140122-070810
ERROR: cat /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/mct.bldlog.140122-070810

I think this is something related to the MPI path. When I removed the LD_LIBRARY_PATH setting from env_mach_specific, the build went well (but of course, we are back to the original ./ccsm.exe error that libnetcdff.so.5 is not found). And my libnetcdff.so.5 is in the /gscratch/vsm/ravi/local/lib directory, which is where the LD path points. Not sure what's happening.

Ravi
Pennsylvania State University
 

jedwards

CSEG and Liaisons
Staff member
Look for the error in the config.log file in the directory  /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/
 

ruk15@psu_edu

New Member
Thanks. At the beginning of the config.log file it says the following. I did load the MPI modules. The complete config.log is given below; the libimf.so failure is the only error (after I included LD_LIBRARY_PATH in env_mach_specific).

configure:1624: checking for C compiler version
configure:1627: mpicc --version >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

The full config.log:

--------------------------
This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.

It was created by MCT configure 2.0, which was
generated by GNU Autoconf 2.59.  Invocation command line was

  $ ./configure CC=mpicc FC=mpif90 F90=mpif90 INCLUDEPATH=-I/include

## --------- ##
## Platform. ##
## --------- ##

hostname = n0365
uname -m = x86_64
uname -r = 2.6.18-371.3.1.el5
uname -s = Linux
uname -v = #1 SMP Mon Nov 11 03:23:58 EST 2013

/usr/bin/uname -p = unknown
/bin/uname -X     = unknown

/bin/arch              = x86_64
/usr/bin/arch -k       = unknown
/usr/convex/getsysinfo = unknown
hostinfo               = unknown
/bin/machine           = unknown
/usr/bin/oslevel       = unknown
/bin/universe          = unknown

PATH: /sw/openmpi-1.6.2_icc-13.0/bin
PATH: /sw/intel/composer_xe_2013.0.079/bin/intel64
PATH: /sw/intel/composer_xe_2013.0.079/mpirt/bin/intel64
PATH: /sw/openmpi-1.6.4_icc-13.1/bin
PATH: /sw/intel/composer_xe_2013.2.146/bin/intel64
PATH: /sw/intel/composer_xe_2013.2.146/mpirt/bin/intel64
PATH: /sw/openmpi/bin
PATH: /sw/local/bin
PATH: /sw/gnu-parallel/bin
PATH: /opt/mx/bin
PATH: /sw/Modules/3.2.9/bin
PATH: /sw/moab/bin
PATH: /sw/moab/sbin
PATH: /sw/torque/bin
PATH: /sw/torque/sbin
PATH: /usr/kerberos/bin
PATH: /usr/lpp/mmfs/bin
PATH: /usr/local/bin
PATH: /bin
PATH: /usr/bin
PATH: /usr/lusers/rkoppara/bin


## ----------- ##
## Core tests. ##
## ----------- ##

configure:1624: checking for C compiler version
configure:1627: mpicc --version >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1630: $? = 127
configure:1632: mpicc -v >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1635: $? = 127
configure:1637: mpicc -V >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1640: $? = 127
configure:1663: checking for C compiler default output file name
configure:1666: mpicc    conftest.c  >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1669: $? = 127
configure: failed program was:
| /* confdefs.h.  */
|
| #define PACKAGE_NAME "MCT"
| #define PACKAGE_TARNAME "mct"
| #define PACKAGE_VERSION "2.0"
| #define PACKAGE_STRING "MCT 2.0"
| #define PACKAGE_BUGREPORT ""
| /* end confdefs.h.  */
|
| int
| main ()
| {
|
|   ;
|   return 0;
| }
configure:1708: error: C compiler cannot create executables
See `config.log' for more details.

## ---------------- ##
## Cache variables. ##
## ---------------- ##

ac_cv_env_ALLCFLAGS_set=
ac_cv_env_ALLCFLAGS_value=
ac_cv_env_AR_set=
ac_cv_env_AR_value=
ac_cv_env_BABELROOT_set=
ac_cv_env_BABELROOT_value=
ac_cv_env_BIT64_set=
ac_cv_env_BIT64_value=
ac_cv_env_CC_set=set
ac_cv_env_CC_value=mpicc
ac_cv_env_CFLAGS_set=
ac_cv_env_CFLAGS_value=
ac_cv_env_COMPILER_ROOT_set=
ac_cv_env_COMPILER_ROOT_value=
ac_cv_env_CPPFLAGS_set=
ac_cv_env_CPPFLAGS_value=
ac_cv_env_ENDIAN_set=
ac_cv_env_ENDIAN_value=
ac_cv_env_F90FLAGS_set=
ac_cv_env_F90FLAGS_value=
ac_cv_env_F90SUFFIX_set=
ac_cv_env_F90SUFFIX_value=
ac_cv_env_F90_set=set
ac_cv_env_F90_value=mpif90
ac_cv_env_FCFLAGS_set=
ac_cv_env_FCFLAGS_value=
ac_cv_env_FC_set=set
ac_cv_env_FC_value=mpif90
ac_cv_env_FORT_SIZE_set=
ac_cv_env_FORT_SIZE_value=
ac_cv_env_FPPFLAGS_set=
ac_cv_env_FPPFLAGS_value=
ac_cv_env_FPP_set=
ac_cv_env_FPP_value=
ac_cv_env_INCLUDEFLAG_set=
ac_cv_env_INCLUDEFLAG_value=
ac_cv_env_INCLUDEPATH_set=set
ac_cv_env_INCLUDEPATH_value=-I/include
ac_cv_env_LDFLAGS_set=
ac_cv_env_LDFLAGS_value=
ac_cv_env_MACHDEFS_set=
ac_cv_env_MACHDEFS_value=
ac_cv_env_MPIF90_set=
ac_cv_env_MPIF90_value=
ac_cv_env_MPIHEADER_set=
ac_cv_env_MPIHEADER_value=
ac_cv_env_MPILIBS_set=
ac_cv_env_MPILIBS_value=
ac_cv_env_OPT_set=
ac_cv_env_OPT_value=
ac_cv_env_REAL8_set=
ac_cv_env_REAL8_value=
ac_cv_env_build_alias_set=
ac_cv_env_build_alias_value=
ac_cv_env_host_alias_set=
ac_cv_env_host_alias_value=
ac_cv_env_target_alias_set=
ac_cv_env_target_alias_value=

## ----------------- ##
## Output variables. ##
## ----------------- ##

ALLCFLAGS=''
AR=''
BABELROOT=''
BIT64=''
CC='mpicc'
CFLAGS=''
COMPILER_ROOT=''
CPPFLAGS=''
CRULE=''
DEFS=''
ECHO_C=''
ECHO_N='-n'
ECHO_T=''
ENDIAN=''
EXEEXT=''
F90='mpif90'
F90FLAGS=''
F90RULE=''
F90RULECPP=''
F90SUFFIX=''
FC='mpif90'
FCFLAGS=''
FORT_SIZE=''
FPP=''
FPPFLAGS=''
INCLUDEFLAG=''
INCLUDEPATH='-I/include'
LDFLAGS=''
LIBOBJS=''
LIBS=''
LTLIBOBJS=''
MACHDEFS=''
MPIF90=''
MPIHEADER=''
MPILIBS=''
MPISERPATH=''
OBJEXT=''
OPT=''
PACKAGE_BUGREPORT=''
PACKAGE_NAME='MCT'
PACKAGE_STRING='MCT 2.0'
PACKAGE_TARNAME='mct'
PACKAGE_VERSION='2.0'
PATH_SEPARATOR=':'
PYTHON=''
PYTHONOPTS=''
REAL8=''
SHELL='/bin/sh'
ac_ct_CC=''
ac_ct_F90=''
bindir='${exec_prefix}/bin'
build_alias=''
datadir='${prefix}/share'
exec_prefix='NONE'
host_alias=''
includedir='${prefix}/include'
infodir='${prefix}/info'
libdir='${exec_prefix}/lib'
libexecdir='${exec_prefix}/libexec'
localstatedir='${prefix}/var'
mandir='${prefix}/man'
oldincludedir='/usr/include'
prefix='NONE'
program_transform_name='s,x,x,'
sbindir='${exec_prefix}/sbin'
sharedstatedir='${prefix}/com'
sysconfdir='${prefix}/etc'
target_alias=''

## ----------- ##
## confdefs.h. ##
## ----------- ##

#define PACKAGE_BUGREPORT ""
#define PACKAGE_NAME "MCT"
#define PACKAGE_STRING "MCT 2.0"
#define PACKAGE_TARNAME "mct"
#define PACKAGE_VERSION "2.0"

configure: exit 77
--------------------------
 

ruk15@psu_edu

New Member
At least I solved the build problem now. The issue is that in env_mach_specific I had:

setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib

The correct syntax is:

setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib:${LD_LIBRARY_PATH}

Now it builds fine, but gives the following error when the code is executed:

LD_LIBRARY_PATH: Undefined variable.

It does not recognize this variable in env_mach_specific for some reason.

Thanks,
Ravi
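For the record, the "Undefined variable" message is csh behavior: expanding ${LD_LIBRARY_PATH} aborts the script if the variable does not exist at all in that shell. A guarded sketch for env_mach_specific (csh syntax; path taken from the post above):

```csh
# $?LD_LIBRARY_PATH is 1 only when the variable exists, so define it as
# empty first; the append below then works whether or not the login
# environment exported it to this shell.
if ( ! $?LD_LIBRARY_PATH ) setenv LD_LIBRARY_PATH ""
setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib:${LD_LIBRARY_PATH}
```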
 

jedwards

CSEG and Liaisons
Staff member
This has more to do with the configuration of your machine than with CESM. I would guess that the environment from your login node is not being exported to your compute nodes, and LD_LIBRARY_PATH is not defined on the compute nodes.
 

rajkmsaini

Dr. Raj Saini
Member
Dear Jedwards,

I am new to CESM.

$ ./cesm_setup
Macros script already created ...skipping
Machine/Decomp/Pes configuration has already been done ...skipping
Running preview_namelist script
LD_LIBRARY_PATH: Undefined variable.
ERROR: /scratch/am/irdstaff/ird12765/cas/raj/IITD_COE_EXP1/preview_namelists failed: 65280

I am getting the same type of error.

Can you help me with where I need to make a change in the settings below?


======================================
setenv intelhome /home/soft/intel2015
setenv intehome /home/soft/intel2015
setenv CC $intehome/bin/icc
setenv FC $intelhome/bin/ifort
setenv F90 $intelhome/bin/ifort
setenv CXX $intelhome/bin/icpc
setenv PATH $intelhome/bin:$PATH
setenv LD_LIBRARY_PATH $intelhome/lib/intel64:$LD_LIBRARY_PATH
setenv MANPATH $intelhome/man:$MANPATH

#PGI mpi environment
setenv PATH $intelhome/impi/5.0.3.048/intel64/bin:$PATH
setenv LD_LIBRARY_PATH $intelhome/impi/5.0.3.048/intel64/lib:$LD_LIBRARY_PATH
setenv INCLUDE_PATH $intelhome/impi/5.0.3.048/intel64/include:$INCLUDE_PATH

#Deps
setenv PATH /home/soft/centOS/apps/wrf/impi20152/bin:$PATH
setenv LD_LIBRARY_PATH /home/soft/centOS/apps/wrf/impi20152/lib:$LD_LIBRARY_PATH
setenv MANPATH $PREFIX/share/man:$MANPATH
setenv PKG_CONFIG_PATH $PREFIX/lib/pkgconfig:$PKG_CONFIG_PATH
setenv LD_RUN_PATH /home/soft/centOS/apps/wrf/impi20152/lib:$LD_RUN_PATH

#Extra
setenv HDF5_INCLUDE_DIRS /home/soft/centOS/apps/wrf/impi20152/include
setenv HDF5_ROOT /home/soft/centOS/apps/wrf/impi20152/dep2
setenv HDF5_PATH /home/soft/centOS/apps/wrf/impi20152/dep2
setenv HDF5_LIBRARY_PATH /home/soft/centOS/apps/wrf/impi20152/lib
setenv INC_HDF5 /home/soft/centOS/apps/wrf/impi20152/include
setenv NETCDF_PATH /home/soft/centOS/apps/wrf/impi20152
setenv NetCDF /home/soft/centOS/apps/wrf/impi20152
setenv NETCDF /home/soft/centOS/apps/wrf/impi20152
setenv PNETCDF_PATH /home/soft/centOS/apps/wrf/impi20152

================================================================
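Each `setenv VAR new:$VAR` line in the fragment above aborts with "VAR: Undefined variable." when VAR is not already set in the shell that sources the file, which matches the error preview_namelists reports. A guarded sketch in csh, shown for LD_LIBRARY_PATH (the same pattern applies to MANPATH, INCLUDE_PATH, PKG_CONFIG_PATH, and LD_RUN_PATH; note also that $PREFIX is referenced in the fragment but never defined):

```csh
# Define the variable as empty if the node did not export it,
# then the usual append is safe.
if ( ! $?LD_LIBRARY_PATH ) setenv LD_LIBRARY_PATH ""
setenv LD_LIBRARY_PATH $intelhome/lib/intel64:${LD_LIBRARY_PATH}
```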


Thank you in advance.


Thanks,
Best,
Raj
 