Using my own JRA files

My case uses JRA files and runs fine on Cheyenne. On our Linux cluster at home I already have some JRA forcing files and I'd rather use those, but instead the CESM system wants to download the 1.3_noleap JRA files - and fails:
Code:
Loading input file list: 'Buildconf/cpl.input_data_list'
Client protocol gftp not enabled
Using protocol wget with user anonymous and passwd user@example.edu
Trying to download file: '../inputdata_checksum.dat' to path '/center1/AKWATERS/kate/climate/kshedstrom/Arctic5_JRA.MOM6/run/inputdata_checksum.dat.raw' using WGET protocol.
SUCCESS

Using protocol ftp with user anonymous and passwd user@example.edu
server address ftp.cgd.ucar.edu root path cesm/inputdata
Trying to download file: '../inputdata_checksum.dat' to path '/center1/AKWATERS/kate/climate/kshedstrom/Arctic5_JRA.MOM6/run/inputdata_checksum.dat.raw' using FTP protocol.
Using protocol svn with user  and passwd
Client protocol None not enabled
Checking server ftp://gridanon.cgd.ucar.edu:2811/cesm/inputdata/ with protocol gftp
ERROR: module command modulecmd python purge  failed with message:
(Yes, a blank message)

It succeeded in downloading the dice file, so I'd be good to go if I could get it to use my JRA files. I tried changing the Buildconf and CaseDocs files, but they were overwritten by case.submit, causing it to fail again.
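For what it's worth, the supported way around that overwriting appears to be a user_ copy of the stream file in the case directory, which CIME will not regenerate. A sketch (the stream file name below is a placeholder - ls CaseDocs to see the real one for a JRA case):
Code:
# from the case directory; the stream name here is hypothetical,
# use the one that actually appears under CaseDocs/
cp CaseDocs/datm.streams.txt.JRA.PRSN user_datm.streams.txt.JRA.PRSN
chmod u+w user_datm.streams.txt.JRA.PRSN
# edit <filePath> and <fileNames> to point at the local forcing,
# then regenerate the namelists so the user_ file is picked up
./preview_namelists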

I did get ./check_input_data to pass, but had to make this change (without it, rel_path is only assigned inside the input_ic_root branch and is undefined for every other file):
Code:
diff --git a/scripts/lib/CIME/case/check_input_data.py b/scripts/lib/CIME/case/check_input_data.py
index 5a4bf3321..c845d27f8 100644
--- a/scripts/lib/CIME/case/check_input_data.py
+++ b/scripts/lib/CIME/case/check_input_data.py
@@ -318,6 +318,7 @@ def check_input_data(case, protocol="svn", address=None, input_data_root=None, d
                 if(full_path):
                     # expand xml variables
                     full_path = case.get_resolved_value(full_path)
+                    rel_path = full_path
                     if input_ic_root and input_ic_root in full_path \
                        and ic_filepath:
                         rel_path = full_path.replace(input_ic_root, ic_filepath)
 
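For reference, pre-staging the forcing under DIN_LOC_ROOT looks something like this (the JRA subdirectory name below is an assumption - the authoritative paths are whatever Buildconf/cpl.input_data_list asks for):
Code:
# see exactly which files and paths the case expects
grep -i jra Buildconf/cpl.input_data_list
# link the existing forcing into that spot under DIN_LOC_ROOT
mkdir -p $DIN_LOC_ROOT/atm/datm7/atm_forcing.datm7.JRA.v1.3.noleap
ln -s /path/to/local/JRA/*.nc $DIN_LOC_ROOT/atm/datm7/atm_forcing.datm7.JRA.v1.3.noleap/
# re-check; no download should be attempted for files already present
./check_input_data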
OK, I fixed it so it finds my JRA files (I had to resort to a symbolic link, roughly as sketched above), but I still get:
Code:
ERROR: module command modulecmd python purge  failed with message:
Again the message is blank; this one comes from ./case.submit.
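That error is CIME reporting that its module-interface call returned nonzero. Running the same call by hand, outside CIME, usually surfaces the real complaint (a quick check, assuming environment-modules is installed where the config below says it is):
Code:
# run exactly what CIME runs
modulecmd python purge   # should print python statements, not fail silently
echo $?                  # a nonzero status here reproduces the CIME error
command -v modulecmd     # confirm it resolves on PATH, batch nodes included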
 
Here's my config_machines chunk:
Code:
  <machine MACH="chinook">
    <DESC>Chinook Linux Cluster UAF, 24 pes/node, InfiniBand, Relion 1900, batch system Scyld</DESC>
    <OS>LINUX</OS>
    <COMPILERS>gnu,intel</COMPILERS>
    <MPILIBS>openmpi</MPILIBS>
    <CIME_OUTPUT_ROOT>/center1/AKWATERS/kate/climate/$USER</CIME_OUTPUT_ROOT>
    <DIN_LOC_ROOT>/center1/AKWATERS/kate/climate/cesm/inputdata</DIN_LOC_ROOT>
    <DIN_LOC_ROOT_CLMFORC>/center1/AKWATERS/kate/climate/cesm/inputdata/atm/datm7</DIN_LOC_ROOT_CLMFORC>
    <DOUT_S_ROOT>/import/AKWATERS/$USER/archive/$CASE</DOUT_S_ROOT>
    <BASELINE_ROOT>/center1/AKWATERS/kate/climate/cesm/ccsm_baselines</BASELINE_ROOT>
    <CCSM_CPRNC>/center1/AKWATERS/kate/climate/cesm/tools/cprnc/cprnc</CCSM_CPRNC>
    <GMAKE>make</GMAKE>
    <GMAKE_J>4</GMAKE_J>
    <BATCH_SYSTEM>slurm</BATCH_SYSTEM>
    <SUPPORTED_BY>uaf-rcs -at- alaska.edu</SUPPORTED_BY>
    <MAX_TASKS_PER_NODE>24</MAX_TASKS_PER_NODE>
    <MAX_MPITASKS_PER_NODE>24</MAX_MPITASKS_PER_NODE>
    <PROJECT_REQUIRED>TRUE</PROJECT_REQUIRED>
    <mpirun mpilib="openmpi">
      <executable>mpirun</executable>
      <arguments>
      </arguments>
    </mpirun>
    <module_system type="module">
      <init_path lang="sh">/usr/share/Modules/init/sh</init_path>
      <init_path lang="bash">/usr/share/Modules/init/bash</init_path>
      <init_path lang="csh">/usr/share/Modules/init/csh</init_path>
      <init_path lang="python">/usr/share/Modules/init/python.py</init_path>
      <init_path lang="perl">/usr/share/Modules/init/perl.pm</init_path>
      <cmd_path lang="sh">module</cmd_path>
      <cmd_path lang="bash">module</cmd_path>
      <cmd_path lang="csh">module</cmd_path>
      <cmd_path lang="python">modulecmd python</cmd_path>
      <cmd_path lang="perl">modulecmd perl</cmd_path>
      <modules>
        <command name="purge"/>
      </modules>
      <modules compiler="gnu">
        <command name="load">slurm</command>
        <command name="load">toolchain/foss/2019b</command>
      </modules>
      <modules compiler="intel">
        <command name="load">slurm</command>
        <command name="load">toolchain/pic-intel/2016b</command>
        <command name="load">data/netCDF-Fortran/4.4.4-pic-intel-2016b</command>
        <command name="load">geo/ESMF/7.0.0-pic-intel-2016b</command>
        <command name="load">devel/CMake/3.5.2-pic-intel-2016b</command>
      </modules>
    </module_system>
    <environment_variables>
      <env name="OMP_STACKSIZE">64M</env>
    </environment_variables>
    <environment_variables comp_interface="nuopc" DEBUG="FALSE" compiler="gnu">
      <env name="ESMFMKFILE">/import/home/kshedstrom/lib/libO/Linux.gfortran.64.mpiuni.default/esmf.mk</env>
    </environment_variables>
    <environment_variables comp_interface="nuopc" DEBUG="TRUE" compiler="gnu">
      <env name="ESMFMKFILE">/import/home/kshedstrom/lib/libO/Linux.gfortran.64.mpiuni.default/esmf.mk</env>
    </environment_variables>
    <environment_variables comp_interface="nuopc" compiler="intel">
      <env name="ESMFMKFILE">/import/AKWATERS/kshedstrom/src/my-esmf/lib/libO/Linux.gfortran.64.mpiuni.default/esmf.mk</env>
    </environment_variables>
    <environment_variables comp_interface="nuopc">
      <env name="ESMF_RUNTIME_PROFILE">ON</env>
      <env name="ESMF_RUNTIME_PROFILE_OUTPUT">SUMMARY</env>
    </environment_variables>
  </machine>
(Compiling with intel now)
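One thing worth checking against that chunk: every init_path and cmd_path has to exist on the compute nodes, not just the login node. A quick sanity pass (paths copied from the XML above; the /usr/bin location is a guess - adjust if environment-modules lives elsewhere):
Code:
# these are the init files CIME will source, per the XML above
ls /usr/share/Modules/init/sh /usr/share/Modules/init/python.py
# the python cmd_path is "modulecmd python"; it only works if modulecmd
# itself is on PATH (or give the full path in cmd_path instead)
command -v modulecmd || ls /usr/bin/modulecmd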
 