ccsm.exe error despite LD_LIBRARY_PATH

ruk15@...

Hi,

I am getting the following error, even though I set LD_LIBRARY_PATH in my .bashrc file:

export LD_LIBRARY_PATH=/gscratch/vsm/ravi/local/lib:$LD_LIBRARY_PATH

I also have this in my 'env_mach_specific':

setenv NETCDF_PATH /gscratch/vsm/ravi/local

I even checked at the prompt that LD_LIBRARY_PATH is set:

echo $LD_LIBRARY_PATH
/gscratch/vsm/ravi/local/lib::/sw/openmpi/lib:/gscratch/vsm/ravi/libcurl

But I still get the following error during execution. The code runs for about 25 seconds and then the job terminates.

./ccsm.exe: error while loading shared libraries: libnetcdff.so.5: cannot open shared object file: No such file or directory
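
A quick way to confirm the loader's view (a diagnostic sketch; run it on the node where the job actually executes, since its environment may differ from the login node's):

```
# List the shared libraries the executable resolves and check the NetCDF ones:
ldd ./ccsm.exe | grep netcdf
# A line such as "libnetcdff.so.5 => not found" confirms the runtime loader
# cannot see the library, regardless of what the login shell's environment says.
```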


Thanks

Ravi

Pennsylvania State University

jedwards

Try putting the LD_LIBRARY_PATH into the env_mach_specific file. You can also pass the -Wl,-rpath flag to the linker by adding it to the Macros file. Make sure that libnetcdff.so.5 is in the specified directory.
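
A sketch of what that Macros addition might look like, assuming the NetCDF libraries live under /gscratch/vsm/ravi/local/lib (the path from the original post); the exact variable name (SLIBS here) varies between CESM versions and machine files:

```
# Macros file: link against NetCDF and embed the library directory in the
# runpath, so the executable finds libnetcdff.so.5 without LD_LIBRARY_PATH.
SLIBS += -L/gscratch/vsm/ravi/local/lib -lnetcdff -lnetcdf \
         -Wl,-rpath,/gscratch/vsm/ravi/local/lib
```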

CESM Software Engineer

ruk15@...

Thank you. I did that, and the build failed with the following error:

******************

CCSM PRESTAGE SCRIPT HAS FINISHED SUCCESSFULLY
-------------------------------------------------------------------------
 CCSM BUILDEXE SCRIPT STARTING
 - Build Libraries: mct pio csm_share
Wed Jan 22 07:08:36 PST 2014 /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/mct.bldlog.140122-070810
ERROR: mct.buildlib failed, see /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/mct.bldlog.140122-070810
ERROR: cat /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/mct.bldlog.140122-070810

*************************

I think this is something related to the MPI path. But when I removed the LD_LIBRARY_PATH from env_mach_specific, the build went well (though of course we are back to the original ./ccsm.exe error that libnetcdff.so.5 is not found). Also, my libnetcdff.so.5 is in the /gscratch/vsm/ravi/local/lib directory, which is exactly where the LD path points.

Not sure what's happening.

Ravi

Pennsylvania State University

jedwards

Look for the error in the config.log file in the directory  /gscratch/vsm/ravi/OUTPUT/Tidallock//Aqua_Tidallock/mct/

CESM Software Engineer

ruk15@...

Thanks. At the beginning of the config.log file, it says the following. I did load the MPI modules. I have included the complete config.log below; the 'libimf.so' error is the only one (it appeared after I added LD_LIBRARY_PATH to 'env_mach_specific').


********************

configure:1624: checking for C compiler version
configure:1627: mpicc --version </dev/null >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory

*******************


The total config.log file:


--------------------------

This file contains any messages produced by compilers while
running configure, to aid debugging if configure makes a mistake.

It was created by MCT configure 2.0, which was
generated by GNU Autoconf 2.59.  Invocation command line was

  $ ./configure CC=mpicc FC=mpif90 F90=mpif90 INCLUDEPATH=-I/include

## --------- ##
## Platform. ##
## --------- ##

hostname = n0365
uname -m = x86_64
uname -r = 2.6.18-371.3.1.el5
uname -s = Linux
uname -v = #1 SMP Mon Nov 11 03:23:58 EST 2013

/usr/bin/uname -p = unknown
/bin/uname -X     = unknown

/bin/arch              = x86_64
/usr/bin/arch -k       = unknown
/usr/convex/getsysinfo = unknown
hostinfo               = unknown
/bin/machine           = unknown
/usr/bin/oslevel       = unknown
/bin/universe          = unknown

PATH: /sw/openmpi-1.6.2_icc-13.0/bin
PATH: /sw/intel/composer_xe_2013.0.079/bin/intel64
PATH: /sw/intel/composer_xe_2013.0.079/mpirt/bin/intel64
PATH: /sw/openmpi-1.6.4_icc-13.1/bin
PATH: /sw/intel/composer_xe_2013.2.146/bin/intel64
PATH: /sw/intel/composer_xe_2013.2.146/mpirt/bin/intel64
PATH: /sw/openmpi/bin
PATH: /sw/local/bin
PATH: /sw/gnu-parallel/bin
PATH: /opt/mx/bin
PATH: /sw/Modules/3.2.9/bin
PATH: /sw/moab/bin
PATH: /sw/moab/sbin
PATH: /sw/torque/bin
PATH: /sw/torque/sbin
PATH: /usr/kerberos/bin
PATH: /usr/lpp/mmfs/bin
PATH: /usr/local/bin
PATH: /bin
PATH: /usr/bin
PATH: /usr/lusers/rkoppara/bin


## ----------- ##
## Core tests. ##
## ----------- ##

configure:1624: checking for C compiler version
configure:1627: mpicc --version </dev/null >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1630: $? = 127
configure:1632: mpicc -v </dev/null >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1635: $? = 127
configure:1637: mpicc -V </dev/null >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1640: $? = 127
configure:1663: checking for C compiler default output file name
configure:1666: mpicc    conftest.c  >&5
mpicc: error while loading shared libraries: libimf.so: cannot open shared object file: No such file or directory
configure:1669: $? = 127
configure: failed program was:
| /* confdefs.h.  */
|
| #define PACKAGE_NAME "MCT"
| #define PACKAGE_TARNAME "mct"
| #define PACKAGE_VERSION "2.0"
| #define PACKAGE_STRING "MCT 2.0"
| #define PACKAGE_BUGREPORT ""
| /* end confdefs.h.  */
|
| int
| main ()
| {
|
|   ;
|   return 0;
| }
configure:1708: error: C compiler cannot create executables
See `config.log' for more details.

## ---------------- ##
## Cache variables. ##
## ---------------- ##

ac_cv_env_ALLCFLAGS_set=
ac_cv_env_ALLCFLAGS_value=
ac_cv_env_AR_set=
ac_cv_env_AR_value=
ac_cv_env_BABELROOT_set=
ac_cv_env_BABELROOT_value=
ac_cv_env_BIT64_set=
ac_cv_env_BIT64_value=
ac_cv_env_CC_set=set
ac_cv_env_CC_value=mpicc
ac_cv_env_CFLAGS_set=
ac_cv_env_CFLAGS_value=
ac_cv_env_COMPILER_ROOT_set=
ac_cv_env_COMPILER_ROOT_value=
ac_cv_env_CPPFLAGS_set=
ac_cv_env_CPPFLAGS_value=
ac_cv_env_ENDIAN_set=
ac_cv_env_ENDIAN_value=
ac_cv_env_F90FLAGS_set=
ac_cv_env_F90FLAGS_value=
ac_cv_env_F90SUFFIX_set=
ac_cv_env_F90SUFFIX_value=
ac_cv_env_F90_set=set
ac_cv_env_F90_value=mpif90
ac_cv_env_FCFLAGS_set=
ac_cv_env_FCFLAGS_value=
ac_cv_env_FC_set=set
ac_cv_env_FC_value=mpif90
ac_cv_env_FORT_SIZE_set=
ac_cv_env_FORT_SIZE_value=
ac_cv_env_FPPFLAGS_set=
ac_cv_env_FPPFLAGS_value=
ac_cv_env_FPP_set=
ac_cv_env_FPP_value=
ac_cv_env_INCLUDEFLAG_set=
ac_cv_env_INCLUDEFLAG_value=
ac_cv_env_INCLUDEPATH_set=set
ac_cv_env_INCLUDEPATH_value=-I/include
ac_cv_env_LDFLAGS_set=
ac_cv_env_LDFLAGS_value=
ac_cv_env_MACHDEFS_set=
ac_cv_env_MACHDEFS_value=
ac_cv_env_MPIF90_set=
ac_cv_env_MPIF90_value=
ac_cv_env_MPIHEADER_set=
ac_cv_env_MPIHEADER_value=
ac_cv_env_MPILIBS_set=
ac_cv_env_MPILIBS_value=
ac_cv_env_OPT_set=
ac_cv_env_OPT_value=
ac_cv_env_REAL8_set=
ac_cv_env_REAL8_value=
ac_cv_env_build_alias_set=
ac_cv_env_build_alias_value=
ac_cv_env_host_alias_set=
ac_cv_env_host_alias_value=
ac_cv_env_target_alias_set=
ac_cv_env_target_alias_value=

## ----------------- ##
## Output variables. ##
## ----------------- ##

ALLCFLAGS=''
AR=''
BABELROOT=''
BIT64=''
CC='mpicc'
CFLAGS=''
COMPILER_ROOT=''
CPPFLAGS=''
CRULE=''
DEFS=''
ECHO_C=''
ECHO_N='-n'
ECHO_T=''
ENDIAN=''
EXEEXT=''
F90='mpif90'
F90FLAGS=''
F90RULE=''
F90RULECPP=''
F90SUFFIX=''
FC='mpif90'
FCFLAGS=''
FORT_SIZE=''
FPP=''
FPPFLAGS=''
INCLUDEFLAG=''
INCLUDEPATH='-I/include'
LDFLAGS=''
LIBOBJS=''
LIBS=''
LTLIBOBJS=''
MACHDEFS=''
MPIF90=''
MPIHEADER=''
MPILIBS=''
MPISERPATH=''
OBJEXT=''
OPT=''
PACKAGE_BUGREPORT=''
PACKAGE_NAME='MCT'
PACKAGE_STRING='MCT 2.0'
PACKAGE_TARNAME='mct'
PACKAGE_VERSION='2.0'
PATH_SEPARATOR=':'
PYTHON=''
PYTHONOPTS=''
REAL8=''
SHELL='/bin/sh'
ac_ct_CC=''
ac_ct_F90=''
bindir='${exec_prefix}/bin'
build_alias=''
datadir='${prefix}/share'
exec_prefix='NONE'
host_alias=''
includedir='${prefix}/include'
infodir='${prefix}/info'
libdir='${exec_prefix}/lib'
libexecdir='${exec_prefix}/libexec'
localstatedir='${prefix}/var'
mandir='${prefix}/man'
oldincludedir='/usr/include'
prefix='NONE'
program_transform_name='s,x,x,'
sbindir='${exec_prefix}/sbin'
sharedstatedir='${prefix}/com'
sysconfdir='${prefix}/etc'
target_alias=''

## ----------- ##
## confdefs.h. ##
## ----------- ##

#define PACKAGE_BUGREPORT ""
#define PACKAGE_NAME "MCT"
#define PACKAGE_STRING "MCT 2.0"
#define PACKAGE_TARNAME "mct"
#define PACKAGE_VERSION "2.0"

configure: exit 77
--------------------------

ruk15@...

At least the build problem is solved now. The issue is that in 'env_mach_specific' I had the following:

setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib

The correct syntax is:

setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib:${LD_LIBRARY_PATH}

Now it builds fine, but gives the following error when the code is executed:

LD_LIBRARY_PATH: Undefined variable.

It does not recognize this variable in the 'env_mach_specific' for some reason.
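
A likely cause, as a hedged note: csh/tcsh (the shell that env_mach_specific's setenv syntax implies) raises "Undefined variable" when it expands ${LD_LIBRARY_PATH} and the variable is not set at all, which can happen on compute nodes. A guarded version avoids that:

```
# Only append the old value if LD_LIBRARY_PATH is actually defined;
# $?LD_LIBRARY_PATH is csh's test for "variable is set".
if ( ! $?LD_LIBRARY_PATH ) then
    setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib
else
    setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib:${LD_LIBRARY_PATH}
endif
```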

Thanks

Ravi


jedwards

This has more to do with the configuration of your machine than with CESM. I would guess that the environment from your login node is not being exported to your compute nodes, so LD_LIBRARY_PATH is not defined there.
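
If the batch system is Torque/PBS (which the /sw/torque/bin entries in the PATH above suggest), two common workarounds, offered here as assumptions about this site's setup, are exporting the submission environment or setting the variable inside the job script itself:

```
# In the job script: ask PBS to export the submitting shell's environment...
#PBS -V
# ...or set the path explicitly so it exists on the compute nodes (csh syntax):
setenv LD_LIBRARY_PATH /gscratch/vsm/ravi/local/lib
```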

CESM Software Engineer
