
Guidance on Running LME Datasets with CESM2.1.3 (CLM5)

wvsi3w
Member
Hello,
I have some questions about running with LME data, plus some related inquiries on which I would like your opinion:

For my project I am using CESM 2.1.3 with a combination of paleo and future scenarios: LME data (850 CE onward) and SSPs, for both global and regional simulations, covering 850 to 2100 (or say 2300, using SSP development data). I have my own soil layer structure, so I will need to do my own spin-up, as restart files from other simulations and projects won't be usable. I am facing a few technical and strategic challenges:

1. Dataset selection and compatibility: I plan to use the CCSM4-based LME data (Dataset: CESM1 Last Millennium Ensemble) as atmospheric forcing for the last millennium, but I am unclear which dataset series ("003" or "007") would be optimal for this purpose, and why. As you can see at the download link for the 6-hourly atmospheric data, there are two types of files; for example, for the Q variable:
b.e11.BLMTRC5CN.f19_g16.007.cam.h2.Q.0850010100Z-0899123118Z.nc
b.e11.BLMTRC5CN.f19_g16.850forcing.003.cam.h2.Q.0850010100Z-0899123118Z.nc
Also, if this dataset is not compatible with CESM2.1.3 (CLM5), what would you suggest using instead that covers 850 onward?

2. Are there recommendations on using "6-hourly instantaneous" vs. "6-hourly averaged" data for climate dynamics over this period?

3. I asked a question in this thread about the process, and I would like your thoughts on it too: what are the steps for a simulation using LME datasets from the beginning (setting up the case, paths, names, spin-up, transient, ...)? Is there any guideline for a global simulation? I found this link for PaleoResources, but I don't think it will help, since it is for CLM4.

4. Is the process for surface dataset generation different in single-point mode when using the LME dataset?

4.1. Also, a less related question about the single-point process: when people on the forum talk about "using our own forcing dataset", they seem to mean single-point or regional jobs. Is it even possible to use our own forcing for global simulations, or at other resolutions? I think it is, but then why is the CLM_USRDAT_NAME method only mentioned in the single-point workflow?

5. What kind of vegetation dynamics would be ideal for this type of simulation? Is FATES suitable, and is there any dynamic-vegetation option other than FATES? Any limitations? I ask because analyzing vegetation dynamics is an important part of my project, so knowing this before running such a lengthy simulation would save a lot of time and energy.

6. What about coupling CLM with CAM, or running fully coupled CESM? Can you think of a way to do either with LME data in a 1000+ year simulation? This part is also important because some atmospheric variables (circulation, patterns, teleconnections) must be analyzed too, so I will need CAM.

6.1. Consequently, if you have experience coupling CLM and CAM, or with giving CLM output to CAM as initial conditions (or perhaps as "forcing"), please let me know the general process or guidelines. If I am not thinking about this clearly, please tell me; I don't know whether giving CLM output to CAM as an initial condition is a valid approach. I suspect I need a CLM-CAM coupled run (or a fully coupled run) to achieve the objectives above, but I need your advice.

Thanks a lot.
 

wvsi3w
Member
Dear @jiangzhu, yours is the only ID I could find that seems related to the Paleo forum. I would be really grateful if you could let me know what you think about these questions (especially questions 1 and 2). Thanks a lot.
 

wvsi3w
Member
Dear Scientists,
I have a question about the LME dataset I downloaded (9 TB; 7 variables: U, V, Q, T, PS, PRECT, FLDS, FSDS):

The data is at 2 degrees, and since I need it at 1 degree I did some regridding (cdo remapbil,r360x180). I noticed that most people use their own atmospheric forcing to run a single-point or regional case, but my question is: can we run it globally? I think we can, but is the process different? Most of the threads I have read do this for regional studies, and I need to do it globally. If so, which link or documentation should I follow?

Another question: my dataset consists of 23 NetCDF files per variable, one for 850-899, another for 900-949, and so on up to 1950-2005. I read in some threads that we need to split the data into monthly files, or reduce the number of forcing years per file in the stream file, e.g. PRECT/850-01.nc, 850-02.nc, ..., 2005-12.nc.
Is that right, or can I keep the 50-year files as they are?
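
For reference, the commands I have in mind look like these (file and output names are illustrative, not my real paths):
Code:
# Regrid one 50-year LME file from ~2 deg to a regular 1x1 deg grid (bilinear):
cdo remapbil,r360x180 \
    b.e11.BLMTRC5CN.f19_g16.007.cam.h2.PRECT.0850010100Z-0899123118Z.nc \
    PRECT_1deg_0850-0899.nc

# Split the regridded file into one file per year and month; cdo appends a
# YYYYMM suffix, giving PRECT/085001.nc, PRECT/085002.nc, ...
cdo splityearmon PRECT_1deg_0850-0899.nc PRECT/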

It would be great to know your opinion (@oleson)
 

wvsi3w
Member
Dear @slevis and @oleson, I would be happy and grateful to know your opinion about the questions I asked above. Thanks.

The data I regridded is more than 50 TB (LME 6-hourly atmosphere data for the 7 variables required to force the land model). I have to prepare it on another system (because of storage limits), then transfer it, 100 years at a time, to the system where I will run the model, and run it 100 years at a time (10 consecutive 100-year segments = 1000 years). Based on the two threads I mentioned in my previous question, about separating the files and reducing the number of forcing years, I think I must put the data in monthly files, and I have started splitting the whole 50 TB now. Still, I need to make sure this is the right approach, because storage on my archive system is reaching its limit: after I split the files into monthly pieces I will have to delete the original 50-year files to have enough space for the next step, so your opinion here will be extremely helpful and save me a lot of time.

I have some more questions too:

1- The temperature data available in this 6-hourly LME set (Dataset: ucar.cgd.ccsm4.cesmLME.atm.proc.6hourly_inst.T): is this the TBOT? TS is not TBOT, so I think this T should be used (at its lowest level = 0) (?)

2- All the 6-hourly LME data comes in two types:

b.e11.BLMTRC5CN.f19_g16.007.cam.h2.T.0850010100Z-0899123118Z.nc
and
b.e11.BLMTRC5CN.f19_g16.850forcing.003.cam.h2.T.0850010100Z-0899123118Z.nc

I think the difference is that the second one used year-850 forcing for the spin-up (?). I downloaded all 7 variables of the first type; I didn't know which series (003 or 007) to choose, so I went with the first. Did I choose right? If not, I have to download the other one immediately. (I actually asked this in my first post in this thread.)

3- When I start the spin-up and my transient run using this LME data (850 onward), what should I set for the CO2 concentration? Besides the 7 variables required to run CLM5, there is also CO2, and I guess I have to make another file for it using a constant value from 850 until 1850 (?)

4- Could you confirm that the procedure for running globally is similar to the regional/single-point procedure discussed in threads where people used their own atmospheric forcing? I don't know why I couldn't find a thread or guideline for the global case; everyone seems to use "running with your own atm forcing" for regional studies!
 

slevis

Moderator
Staff member
I will respond to things that I think I can help with:
3. If I were running 850 to 1850, then I would set the CO2 to the known record. For an 850 spin-up, I would use the CO2 value for that year.
4. Yes, it should be similar, since global simulations are the most general case of regional ones.
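
For a constant-CO2 spin-up, that can be set on the case; a sketch (the 850 CE value below is illustrative, so check it against the record you use):
Code:
# Constant CO2 (ppmv) seen by the land model during the 850 spin-up;
# the value here is illustrative.
./xmlchange CCSM_CO2_PPMV=279.0

For the transient you would instead need a CO2 time series covering 850 onward; the default datm co2tseries stream only starts in 1850.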
 

wvsi3w
Member
Thank you Sam for your kind response.
While preparing the 50 TB of data (FSDS, PRECT, TPHWL) I thought of another question:
Since I cannot put all 1000 years of data in the input path at once (due to limited storage on the system where I run the model), I thought I could split the run into 100-year or 50-year segments. After the 850 spin-up is done I should start the transient run, but which xmlchange options do I need then? When AD is done we do ND with RUN_TYPE=startup, pointing to the AD restart file. So for the transient, should I point to the final spun-up restart file and again use RUN_TYPE=startup? And when the first 100 years finish, do I again put the restart from that first transient segment into finidat and continue with a startup run?

I am asking because I read somewhere that I probably need a branch-type run for this, and I am not sure which is correct.
 

slevis

Moderator
Staff member
Use "startup" with CONTINUE_RUN=FALSE when you start each phase (such as the beginning of AD, ND, or transient).

Use "startup" with CONTINUE_RUN=TRUE while continuing a phase (such as partway through AD, ND, or transient).
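
In xmlchange terms (CONTINUE_RUN is the variable in env_run.xml):
Code:
# Beginning of a phase (AD, ND, or transient):
./xmlchange RUN_TYPE=startup,CONTINUE_RUN=FALSE

# Resuming partway through the same phase:
./xmlchange CONTINUE_RUN=TRUE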
 

wvsi3w
Member
Thanks a lot.
So after my spin-up is finished, should I create a new case for my transient run (just like the ND step), point to the rpointers of my final spin-up, and copy its restart file into my new transient case (which is a startup run)?
Or should the transient use another run type (branch, hybrid)?
 

oleson

Keith Oleson
CSEG and Liaisons
Staff member
For land-only transient, I generally use startup but hybrid should work as well.
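
For the startup route, the new transient case just points at the spun-up restart through finidat; a sketch, with an illustrative archive path and file name:
Code:
! In user_nl_clm of the transient case (restart path/name illustrative):
finidat = '/path/to/archive/ND_case/rest/0851-01-01-00000/ND_case.clm2.r.0851-01-01-00000.nc'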
 

wvsi3w
Member
Hello again,
I have finished preparing the ~1000 years of LME CCSM4 data (6-hourly atmospheric data from 850 CE to 2005: Solar + Precip + TPHWL); the process involved modifying, regridding, splitting, and merging the files and adding metadata and units. I divided the data into 100-year parts (due to disk-space limitations), and now I have to do the following:

1st) I need to check that the 100-year data I prepared works, via a test run with the IHistClm50BgcCrop compset (1 deg). For this I will create a new case, and I think I should change the paths to the Solar + Precip + TPHWL directories through a user_nl_clm option (?). Or should I remove the data inside the forcing directories that the case points to and put my Solar + Precip + TPHWL data there instead? (If that is the way, I should wait until my current simulation finishes, because removing the input data would make it fail.) And what about the domain file and the other files such as CO2?

So the question is: what is the process for testing the 100-year data I prepared with an IHistClm50BgcCrop compset (1 deg)? Imagine I have 100 years of 6-hourly data ready for Solar + Precip + TPHWL, with the file names in one of the directories looking like "85001.nc 85002.nc 85003.nc ... 94912.nc".

Moreover, I checked the domain files available for the 1-deg cases, and as we all know that resolution is 0.9x1.25, while my 100-year data is exactly one degree (1x1). So I think I need to modify the domain, grid, maps, and masks? (I don't know how, or where to begin.)

I have read every related thread on the forum regarding these questions and, unfortunately, I must admit I don't get it; I don't know how to do it step by step.
I have also read this documentation (and this one too) but am still not sure what to do. Maybe I have a reading disability :( Sorry for any inconvenience.

2nd) I need to do another spin-up using the LME dataset, giving 850-851 as the forcing years and running for about 500 years (the current AD+ND spin-up is for 1901 forcing, and with 850 CE data we need another spin-up for that period). If I know the answers to my questions in part 1st, I should be ready for this part too. All I need to do is give the restart file from my spun-up AD+ND simulation as finidat for the 850 CE spin-up (?)

3rd) Then I need to start the transient run from 850 onward, and due to the lack of space on the systems I am using, I need to do it 100 years at a time. Is there anything I should be warned about before I ruin everything, in terms of the run structure, the XML options, settings, etc.?

-- I should state that this is a global simulation (1x1), and I think some things are not covered in the "Creating input for surface dataset generation" documentation for global simulations; as far as I understood, that documentation is aimed at regional cases (?)

-- Thank you @slevis and @oleson for your time and support.
 

oleson

Keith Oleson
CSEG and Liaisons
Staff member
1st) Basically, the datm is controlled by these files (at least in release-cesm2.1.5), which can be found in the CaseDocs directory within your case directory for a transient run:

datm_in
datm.streams.txt.CLMGSWP3v1.Precip
datm.streams.txt.CLMGSWP3v1.Solar
datm.streams.txt.CLMGSWP3v1.TPQW
datm.streams.txt.co2tseries.20tr
datm.streams.txt.presaero.trans_1850-2000
datm.streams.txt.topo.observed

To use your own atmospheric forcing, you need to modify those files; for 6-hourly meteorological forcing, that means the first four.
You do this by copying them into your case directory and prepending "user_" to the file names.
A snippet of the Precip file looks like this by default (GSWP3 is the default atmospheric forcing):

<?xml version="1.0"?>
<file id="stream" version="1.0">
<dataSource>
GENERIC
</dataSource>
<domainInfo>
<variableNames>
time time
xc lon
yc lat
area area
mask mask
</variableNames>
<filePath>
/glade/campaign/cesm/cesmdata/inputdata/atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516
</filePath>
<fileNames>
domain.lnd.360x720_gswp3.0v1.c170606.nc
</fileNames>
</domainInfo>
<fieldInfo>
<variableNames>
PRECTmms precn
</variableNames>
<filePath>
/glade/campaign/cesm/cesmdata/inputdata/atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/Precip
</filePath>
<fileNames>
clmforc.GSWP3.c2011.0.5x0.5.Prec.1901-01.nc
clmforc.GSWP3.c2011.0.5x0.5.Prec.1901-02.nc
...

So you'll need to replace this with your own domain file, the path to your forcing files, and your forcing file names.
To create a domain file, as detailed in the link you shared, you basically need to create a SCRIP grid file and a SCRIP mapping file that can be used as input to gen_domain. Generally speaking, you can use components/clm/tools/mkmapgrids/mkscripgrid.ncl to create the SCRIP grid file, then cime/tools/mapping/gen_mapping_files/gen_ESMF_mapping_file/create_ESMF_map.sh to create the mapping file, and then cime/tools/mapping/gen_domain_files/gen_domain to create the domain file. Another option is to create the domain file with an offline script, following the structure of the GSWP3 domain file in the example above.

2nd) Yes
3rd) That's not a normal mode of operation, so I can't be sure what problems you might run into. The only thing I can think of is that the datm usually needs a time slice of atmospheric forcing from December of a prior year or January of the next year to do time interpolation, so you may want to add another month or year of data before each 100-year period.
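
In outline, that tool chain looks something like this (a sketch: $CESMROOT, the grid name, and the generated map file name are all illustrative):
Code:
# 1) SCRIP grid file for the regular 1x1 deg forcing grid:
cd $CESMROOT/components/clm/tools/mkmapgrids
export PTNAME=lme_1x1 S_LAT=-90 N_LAT=90 W_LON=0 E_LON=360
export NX=360 NY=180 IMASK=1 GRIDFILE=SCRIPgrid_lme_1x1.nc
ncl mkscripgrid.ncl

# 2) Mapping file; source and destination are the same grid here, since the
#    domain is only needed to describe the forcing grid:
cd $CESMROOT/cime/tools/mapping/gen_mapping_files/gen_ESMF_mapping_file
./create_ESMF_map.sh -fsrc $CESMROOT/components/clm/tools/mkmapgrids/SCRIPgrid_lme_1x1.nc -nsrc lme_1x1 \
                     -fdst $CESMROOT/components/clm/tools/mkmapgrids/SCRIPgrid_lme_1x1.nc -ndst lme_1x1 -map aave

# 3) Domain file from the mapping file (gen_domain must be built first;
#    the map file name below is whatever step 2 reported):
cd $CESMROOT/cime/tools/mapping/gen_domain_files
./gen_domain -m map_lme_1x1_TO_lme_1x1_aave.YYMMDD.nc -o lme_1x1 -l lme_1x1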
 

wvsi3w
Member
Dear Keith,
Thank you very much for your helpful response.

I copied the first four files from the CaseDocs of a previous case into my new case (TEST_LME) and renamed them all:
user_datm_in, user_datm.streams.txt.CLMGSWP3v1.Precip, user_datm.streams.txt.CLMGSWP3v1.Solar, user_datm.streams.txt.CLMGSWP3v1.TPQW

Then I changed the streams in user_datm_in like this:
Code:
&datm_nml
  decomp = "1d"
  factorfn = "null"
  force_prognostic_true = .false.
  iradsw = 1
  presaero = .true.
  restfilm = "undefined"
  restfils = "undefined"
  wiso_datm = .false.
/
&shr_strdata_nml
  datamode = "CLMNCEP"
  domainfile = "/home/USER/projects/def-X/USER/inputdata/share/domains/domain.lnd.fv0.9x1.25_gx1v7.151020.nc"
  dtlimit = 1.5, 1.5, 1.5, 1.5, 1.5, 1.5
  fillalgo = "nn", "nn", "nn", "nn", "nn", "nn"
  fillmask = "nomask", "nomask", "nomask", "nomask", "nomask", "nomask"
  fillread = "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET"
  fillwrite = "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET"
  mapalgo = "bilinear", "bilinear", "bilinear", "bilinear", "bilinear", "nn"
  mapmask = "nomask", "nomask", "nomask", "nomask", "nomask", "nomask"
  mapread = "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET"
  mapwrite = "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET", "NOT_SET"
  readmode = "single", "single", "single", "single", "single", "single"
  streams = "user_datm.streams.txt.CLMGSWP3v1.Solar 850 850 851",
      "user_datm.streams.txt.CLMGSWP3v1.Precip 850 850 851",
      "user_datm.streams.txt.CLMGSWP3v1.TPQW 850 850 851",
      "datm.streams.txt.presaero.trans_1850-2000 1849 1849 2014",
      "datm.streams.txt.topo.observed 1 1 1",
      "datm.streams.txt.co2tseries.20tr 1850 1850 2014"
  taxmode = "cycle", "cycle", "cycle", "cycle", "cycle", "extend"
  tintalgo = "coszen", "nearest", "linear", "linear", "lower", "linear"
  vectors = "null"
/

As you said, I also need to edit the domain file path in the four files and change the filePath and fileNames entries for the three forcing streams.
I was reading this part about SCRIP, and it says it is for regional or single-point runs. However, since I am going to use a 1x1 resolution, I think I should go through that mkscripgrid.ncl tool, right? The README for mkscripgrid.ncl says the following:

Code:
NCL script to create a SCRIP grid file for a regular lat/lon grid.

To use the script, set the following environment variables

Required (or defaults to a single point over Boulder Colorado)

PTNAME    ! name of your grid
S_LAT     ! Southern latitude corner
N_LAT     ! Northern latitude corner
E_LON     ! Eastern longitude corner
W_LON     ! Western longitude corner

Optional:

NX        ! Number of grid points along longitude (default 1)
NY        ! Number of grid points along latitude (default 1)
IMASK     ! 0 or 1, mask to use if all points are active or not (default active)
PRINT     ! TRUE/FALSE do extra verbose printing or not (default FALSE)
GRIDFILE  ! Output filename

Given that I want a global simulation for now, should I only set NX and NY for this tool? But it says NX and NY are optional, and the required variables are the regional corner ones. So I assume I just load the ncl module and run the tool like "./mkscripgrid.ncl -NX=360 -NY=180" (?). That doesn't seem right, though, since the README says the regional variables are required. I also see that the mknocnmap.pl tool requires the regional variables, which makes me wonder whether I can use it for my global case. I guess I can't use the mapping tool and the domain-generation tool without doing this first step. May I ask what I should do for my case?

Thanks again.

P.S. I almost forgot to ask about the transient run. For a spin-up we set the forcing years (I set 850 and 851 in my current LME test), but what about the transient? When I run a transient from 850 to 949 (100 years), should I change the stream entries and file names in these four essential files to something like this (?):

user_datm.streams.txt.CLMGSWP3v1.Precip 850 850 949

AND

<fileNames>
85001.nc
85002.nc
...
94912.nc
 

oleson

Keith Oleson
CSEG and Liaisons
Staff member
The user_* files are used by the namelist script to generate the namelist files, if you want to change the default namelists. The namelist files themselves in your CaseDocs directory won't have user_* prepended to them, so don't put user_* names in user_nl_datm; they should remain datm.streams.txt.*. And you only need to put the lines that are different into user_nl_datm (e.g., your domain file path/name); you don't need to repeat the entire file contents.
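
For example, if only the domain file changes, user_nl_datm can be as small as this (path and file name illustrative):
Code:
! user_nl_datm: only the entries that differ from the generated defaults.
domainfile = "/path/to/my/domain.lnd.lme_1x1.nc"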
The ncl script is set up for a single point (Boulder, CO), so NX and NY are 1 by default and thus "optional".
For your global 1deg grid, you'll need to set those. So, NX=360 and NY=180.
You can control the years of meteorological forcing that appears in the datm.streams.txt.* files using the following in env_run.xml

DATM_CLMNCEP_YR_START
DATM_CLMNCEP_YR_ALIGN
DATM_CLMNCEP_YR_END
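
With xmlchange that would be, for example (years illustrative for the first 100-year transient segment):
Code:
./xmlchange DATM_CLMNCEP_YR_START=850
./xmlchange DATM_CLMNCEP_YR_END=949
# Model year that lines up with DATM_CLMNCEP_YR_START:
./xmlchange DATM_CLMNCEP_YR_ALIGN=850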

Also, keep in mind that you'll have to figure out how you want to deal with the aerosol and CO2 streams.
 

wvsi3w
Member
Dear Keith, hello again.
Thank you for your response, and sorry for the late reply; I had to take care of my other project in the meantime.

I removed the user_* files from the CaseDocs of my case (TEST_LME) and put them in the case directory.

As I understood it, the first part of this process is to create the SCRIP grid file.
The path to the ncl tool is my_cesm_sandbox/components/clm/tools/mkmapgrids/ for me.
I had to set the following before running "ncl mkscripgrid.ncl":
Code:
export PTNAME="global_1x1"
export S_LAT="-90"
export N_LAT="90"
export E_LON="360"
export W_LON="0"
export NX="360"
export NY="180"
export IMASK="1"
export GRIDFILE="SCRIPgrid_1x1_nomask.nc"

It created the file "SCRIPgrid_1x1_nomask.nc", whose header looks like this:
Code:
ncdump -h SCRIPgrid_1x1_nomask.nc
netcdf SCRIPgrid_1x1_nomask {
dimensions:
        grid_size = 64800 ;
        grid_corners = 4 ;
        grid_rank = 2 ;
variables:
        int grid_dims(grid_rank) ;
        double grid_center_lat(grid_size) ;
                grid_center_lat:units = "degrees" ;
                grid_center_lat:_FillValue = 9.96920996838687e+36 ;
        double grid_center_lon(grid_size) ;
                grid_center_lon:units = "degrees" ;
                grid_center_lon:_FillValue = 9.96920996838687e+36 ;
        int grid_imask(grid_size) ;
                grid_imask:units = "unitless" ;
        double grid_corner_lat(grid_size, grid_corners) ;
                grid_corner_lat:units = "degrees" ;
        double grid_corner_lon(grid_size, grid_corners) ;
                grid_corner_lon:units = "degrees" ;

// global attributes:
                :date_created = "Sun Dec 22 22:15:42 EST 2024" ;
                :Createdby = "ESMF_regridding.ncl" ;
                :Conventions = "SCRIP" ;
                :title = "SCRIP grid file for global_1x1" ;
                :history = "Sun Dec 22 22:15:42 EST 2024: create using mkscripgrid.ncl" ;
                :comment = "Ocean is assumed to non-existant at this point" ;
                :Version = "release-clm5.0.30" ;
}

The next step was to use "mkmapdata.sh" (?), but it fails:
Code:
./mkmapdata.sh --gridfile /home/USER/my_cesm_sandbox/components/clm/tools/mkmapgrids/SCRIPgrid_1x1_nomask.nc --res 1x1
./mkmapdata.sh
Script to create mapping files required by mksurfdata_map
query command is ./../../bld/queryDefaultNamelist.pl -silent -namelist clmexp  -justvalue -options sim_year=2000 -csmdata /glade/p/cesm/cseg/inputdata

Using user specified scrip grid file: /home/USER/my_cesm_sandbox/components/clm/tools/mkmapgrids/SCRIPgrid_1x1_nomask.nc
Output grid resolution is 1x1
Hostname = narval2.narval.calcul.quebec
Machine narval2.narval.calcul.quebec NOT recognized
Path to ESMF binary directory does NOT exist:
Set the environment variable: ESMFBIN_PATH

and even after setting "export ESMFBIN_PATH=/cvmfs/soft.computecanada.ca/easybuild/software/2020/avx2/MPI/intel2020/openmpi4/esmf/8.2.0/bin" it fails:
Code:
./mkmapdata.sh --gridfile /home/USER/my_cesm_sandbox/components/clm/tools/mkmapgrids/SCRIPgrid_1x1_nomask.nc --res 1x1
./mkmapdata.sh
Script to create mapping files required by mksurfdata_map
query command is ./../../bld/queryDefaultNamelist.pl -silent -namelist clmexp  -justvalue -options sim_year=2000 -csmdata /glade/p/cesm/cseg/inputdata

Using user specified scrip grid file: /home/USER/my_cesm_sandbox/components/clm/tools/mkmapgrids/SCRIPgrid_1x1_nomask.nc
Output grid resolution is 1x1
Hostname = narval2.narval.calcul.quebec
Machine narval2.narval.calcul.quebec NOT recognized
rm: cannot remove 'PET*.Log': No such file or directory
Creating mapping file: map_0.5x0.5_AVHRR_to_1x1_nomask_aave_da_c241222.nc
From input grid: /glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/grids/SCRIPgrid_0.5x0.5_AVHRR_c110228.nc
For output grid: /home/USER/my_cesm_sandbox/components/clm/tools/mkmapgrids/SCRIPgrid_1x1_nomask.nc

Input grid file does NOT exist: /glade/p/cesm/cseg/inputdata/lnd/clm2/mappingdata/grids/SCRIPgrid_0.5x0.5_AVHRR_c110228.nc

Either the machine is not recognized, or I am doing something wrong; I don't know what to do.

Also, since my data is truly at 1x1 resolution, and I will need to spin up using a restart file from another spin-up (1400 years at 0.9x1.25), will it be a problem that my new LME data is 1x1 while the restart file from the previous run is 0.9x1.25? I don't know if the question makes sense, but I felt this might be something I need to consider now.
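
For completeness, the environment I am exporting before the script looks like this; I am assuming mkmapdata.sh honors CSMDATA (the query command it prints suggests it does) and that the mapping source grids have been downloaded into my local inputdata copy:
Code:
# ESMF offline binaries (ESMF_RegridWeightGen):
export ESMFBIN_PATH=/cvmfs/soft.computecanada.ca/easybuild/software/2020/avx2/MPI/intel2020/openmpi4/esmf/8.2.0/bin

# Local inputdata copy instead of the default /glade path; assumption: the
# source grids, e.g. SCRIPgrid_0.5x0.5_AVHRR_c110228.nc, exist under it.
export CSMDATA=/home/USER/projects/def-X/USER/inputdata

./mkmapdata.sh --gridfile /home/USER/my_cesm_sandbox/components/clm/tools/mkmapgrids/SCRIPgrid_1x1_nomask.nc --res 1x1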
 

wvsi3w
Member
I should add that I have read the related posts, and since most of the solutions say to load ESMF, I already did that, but it still fails.
And since I am not doing this for a single-point grid but globally, mknocnmap doesn't apply to my case.

I am using Canadian systems and don't have access to Cheyenne or the directories you have, so I guess this explanation from Erik applies here; but honestly I am not sure I understood it completely, or what exactly someone in my position should do to get through all the needed steps.
 

wvsi3w
Member
Dear Keith @oleson
I hope you are doing well.
I would be grateful to know your opinion on my previous questions regarding the error that shows up even after I load ESMF.
Thank you.
 