
/mct/mct/Makefile': doesn't exist or not a regular file

gourik

Gouri
New Member
Dear Team,

I am trying to port CIME to our HPC cluster. For that, I have created the required cmake files in the cime_config/machines/cmake_macros/ directory; config_machines.xml is also attached herewith. Now when I try to create a new case using the command "python ../../E3SM/cime/scripts/create_newcase --case testcase9 --compset X --res f19_g16 --machine param-shivay --compiler intel", the case directory is created, and the output message is recorded in run_op.txt. The './case.setup' output is attached as case.setup_op. './case.build' gives the error "can't copy '/scratch/um_cdac2/met_models/CDACP_Int/E3SM/externals/mct/mct/Makefile': doesn't exist or not a regular file".
Please help with this issue.
 

Attachments

  • testcase.zip
    17.5 KB

jedwards

CSEG and Liaisons
Staff member
You will have to ask the E3SM developers, but it looks like you have a repeated "mct" in the path E3SM/externals/mct/mct/Makefile.
What is the correct path to that Makefile?
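A quick way to check this suggestion from the command line (a sketch; the path is taken from the error message, and the submodule command assumes E3SM's externals are fetched as git submodules):

```shell
# The build step copies externals/mct/mct/Makefile; if the externals
# were never fetched, that file will not exist.
E3SM_ROOT=/scratch/um_cdac2/met_models/CDACP_Int/E3SM   # path from the error message
if [ -f "$E3SM_ROOT/externals/mct/mct/Makefile" ]; then
  echo "mct Makefile found"
else
  echo "mct Makefile missing - the externals may not be checked out"
  echo "try: cd $E3SM_ROOT && git submodule update --init --recursive"
fi
```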
 

gourik

Gouri
New Member
Thank you for the quick response. I will post the issue on the E3SM forum. This seems to be the correct path, as the Makefile is generated at the bld/intel/mpich/nodebug/nothreads/mct/gptl/gptl/ path.
Adding the debug output from ./case.build in case it gives more details:
Building gptl with output to file /scratch/um_cdac2/met_models/CDACP_Int/e3sm/e3sm_scratch/testcase11/bld/gptl.bldlog.240704-130740
Calling /scratch/um_cdac2/met_models/CDACP_Int/E3SM/share/build/buildlib.gptl
Building mct with output to file /scratch/um_cdac2/met_models/CDACP_Int/e3sm/e3sm_scratch/testcase11/bld/mct.bldlog.240704-130740
Calling /scratch/um_cdac2/met_models/CDACP_Int/E3SM/share/build/buildlib.mct

> /scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py(175)expect()
-> msg = error_prefix + " " + error_msg
(Pdb) exit
Traceback (most recent call last):
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py", line 707, in run_sub_or_cmd
    getattr(mod, subname)(*subargs)
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/share/build/buildlib.mct", line 78, in buildlib
    copyifnewer(
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py", line 2561, in copyifnewer
    safe_copy(src, dest)
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py", line 1428, in safe_copy
    file_util.copy_file(
  File "/opt/ohpc/pub/apps/anaconda3/envs/py38/lib/python3.8/distutils/file_util.py", line 104, in copy_file
    raise DistutilsFileError(

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./case.build", line 267, in <module>
    _main_func(doc)
  File "./case.build", line 251, in _main_func
    success = build.case_build(
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/build.py", line 1329, in case_build
    return run_and_log_case_status(functor, cb, caseroot=caseroot)
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py", line 2482, in run_and_log_case_status
    rv = func()
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/build.py", line 1313, in <lambda>
    functor = lambda: _case_build_impl(
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/build.py", line 1203, in _case_build_impl
    logs = _build_libraries(
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/build.py", line 837, in _build_libraries
    run_sub_or_cmd(
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py", line 719, in run_sub_or_cmd
    expect(False, "{} FAILED, cat {}".format(cmd, logfile))
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py", line 175, in expect
    msg = error_prefix + " " + error_msg
  File "/scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py", line 175, in expect
    msg = error_prefix + " " + error_msg
  File "/opt/ohpc/pub/apps/anaconda3/envs/py38/lib/python3.8/bdb.py", line 88, in trace_dispatch
    return self.dispatch_line(frame)
  File "/opt/ohpc/pub/apps/anaconda3/envs/py38/lib/python3.8/bdb.py", line 113, in dispatch_line
 

gourik

Gouri
New Member
Dear Team,

I was able to resolve the above-mentioned issue. Now,

I am getting the error below while executing ./case.build while porting E3SM to an unsupported machine:
make[1]: *** [src/clib/CMakeFiles/pioc.dir/all] Error 2
make: *** [all] Error 2
> /scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/utils.py(175)expect()
-> msg = error_prefix + " " + error_msg
(Pdb) ERROR: /scratch/um_cdac2/met_models/CDACP_Int/E3SM/externals/scorpio/src/clib/pioc_support.cpp(890): error: identifier "_Bool" is undefined
void pioassert(_Bool expression, const char *msg, const char *fname, int line)
               ^

/scratch/um_cdac2/met_models/CDACP_Int/E3SM/externals/scorpio/src/clib/pioc_support.cpp(3160): error: identifier "NC_64BIT_DATA" is undefined
if ((file->mode & NC_64BIT_DATA) == NC_64BIT_DATA)
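Both errors point at the toolchain rather than the model code: _Bool is a C99 keyword that very old compiler front ends may not recognize, and NC_64BIT_DATA (CDF-5 support) was added around netCDF-C 4.4. A quick, hedged way to check whether the netcdf.h the build is picking up actually defines the symbol (the prefix below is taken from the NETCDF_PATH in the post):

```shell
# Check whether the netcdf.h under this install prefix defines NC_64BIT_DATA.
NETCDF_ROOT=/home/apps/netcdf-4.4.1.1   # prefix from the post's NETCDF_PATH
if grep -q "NC_64BIT_DATA" "$NETCDF_ROOT/include/netcdf.h" 2>/dev/null; then
  echo "NC_64BIT_DATA is defined - this netCDF should be new enough"
else
  echo "NC_64BIT_DATA not found - the build may be picking up an older netcdf.h"
fi
```

If the symbol is present here but the compile still fails, a different (older) netcdf.h is likely earlier on the include path.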
The configuration below is used in config_machines.xml:

<modules>
  <command name="load">netcdf/4.4.1.1/intel</command>
  <command name="load">netcdf_c_with_parallel_support_4.3.3.1</command>
  <command name="load">netcdf_fortran_with_parallel_support_4.4.0</command>
  <command name="load">parallel_hdf5_1.8.21</command>
  <command name="load">parallel_netcdf_1.8.1</command>
  <!-- <command name="load">parallel-netcdf/1.12.2-gcc-9.2.0-xpmi</command> -->
  <!-- <command name="load">gnu8/8.3.0</command>
       <command name="load">mpich/3.3.1</command> -->
  <command name="load">pnetcdf/1.11.1</command>
</modules>
</module_system>
<RUNDIR>$CIME_OUTPUT_ROOT/$CASE/run</RUNDIR>
<EXEROOT>$CIME_OUTPUT_ROOT/$CASE/bld</EXEROOT>
<environment_variables>
  <env name="OMP_STACKSIZE">256M</env>
  <env name="MPI_TYPE_DEPTH">16</env>
  <env name="OMPI_CC">mpiicc</env>
  <env name="OMPI_CXX">mpiicpc</env>
  <env name="OMPI_FC">mpiifort</env>
  <env name="NETCDF_PATH">/home/apps/netcdf-4.4.1.1/include</env>
  <env name="NETCDF_C_PATH">/home/apps/netcdf-4.4.1.1/include</env>
  <env name="NETCDF_FORTRAN_PATH">/home/apps/netcdf-4.4.1.1/include</env>
  <env name="PNETCDF_PATH">/opt/ohpc/pub_bk/libs/gnu8/mpich/pnetcdf/1.11.1/include</env>
</environment_variables>
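One thing to double-check in the fragment above: CIME's NETCDF_PATH / PNETCDF_PATH variables conventionally point at the installation prefix (the directory containing include/ and lib/), not at the include/ directory itself. A sketch of the adjusted entries, reusing the same paths from the post (treat this as an assumption to verify against your install layout):

<environment_variables>
  <!-- Point at the install prefix; the build system appends include/ and lib/. -->
  <env name="NETCDF_PATH">/home/apps/netcdf-4.4.1.1</env>
  <env name="NETCDF_C_PATH">/home/apps/netcdf-4.4.1.1</env>
  <env name="NETCDF_FORTRAN_PATH">/home/apps/netcdf-4.4.1.1</env>
  <env name="PNETCDF_PATH">/opt/ohpc/pub_bk/libs/gnu8/mpich/pnetcdf/1.11.1</env>
</environment_variables>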


The output of ./case.setup is as below:


RUN: /usr/bin/xmllint --xinclude --noout --schema /scratch/um_cdac2/met_models/CDACP_Int/E3SM/cime/CIME/data/config/xml_schemas/entry_id_namelist.xsd /scratch/um_cdac2/met_models/CDACP_Int/E3SM/driver-mct/cime_config/namelist_definition_modelio.xml
FROM: /scratch/um_cdac2/met_models/CDACP_Int/e3sm/e3sm_scratch/testcase18
errput: /scratch/um_cdac2/met_models/CDACP_Int/E3SM/driver-mct/cime_config/namelist_definition_modelio.xml validates

RUN: . /opt/ohpc/admin/lmod/8.0.6/init/sh && module list
FROM: /scratch/um_cdac2/met_models/CDACP_Int/e3sm/e3sm_scratch/testcase18
output: Currently Loaded Modules:
  1) autotools
  2) prun/1.3
  3) ohpc
  4) python3.8/3.8
  5) spack/0.17
  6) gnu8/8.3.0
  7) mpich/3.3.1
  8) pgi/19_10/pgi/19.10
  9) pgi/19_10/openmpi/3.1.3/2019
 10) cuda/10.1
 11) pnetcdf/1.11.0/pgi19.10
 12) intel/2018.5.274
 13) netcdf/4.4.1.1/intel
 14) netcdf_c_with_parallel_support_4.3.3.1
 15) netcdf_fortran_with_parallel_support_4.4.0
 16) parallel_hdf5_1.8.21
 17) parallel_netcdf_1.8.1
 18) pnetcdf/1.11.1


Please help with this.
 

jedwards

CSEG and Liaisons
Staff member
I'm sorry - this is not an E3SM forum, but it looks like your external libraries and compilers are too old for compatibility. Try newer versions of the
GNU compilers, netcdf, pnetcdf, and the Intel compiler. To be honest, I can't tell from your module list which compiler you intend to use, but they are all at least five years old.
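To act on that suggestion, it helps to first confirm which compilers and libraries the case will actually pick up. A small sketch (assumes the relevant modules are loaded; nc-config ships with netCDF-C and pnetcdf-config with newer PnetCDF releases):

```shell
# Print the first version line of each tool on PATH, or note its absence.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    printf '%s: ' "$1"
    "$1" --version 2>/dev/null | head -n 1
  else
    echo "$1: not found in PATH"
  fi
}

for t in mpicc mpif90 nc-config pnetcdf-config; do
  check_tool "$t"
done
```

Whichever compiler mpicc and mpif90 report is the one the build will actually use, regardless of how many compiler modules are loaded.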
 

gourik

Gouri
New Member
Thank you for the reply. Yes, I have posted this on the E3SM forum as well. I will update the compilers as suggested.
 