
How to use my own atmospheric forcing datasets to replace the default GSWP3?

jack
Member
Hi,
I can now successfully run a regional case in CLM5.0 using the default GSWP3 datasets. I want to replace them with my own atmospheric forcing (it has 1 h temporal resolution, and I have processed it into the standard GSWP3 format, i.e., monthly precip, solar, and tphwl files, and also made a domain file for the atmospheric forcing), but I have met some problems:

(1) I just want to run a one-month period (June 2013, because I only have data for this month), but I see that all of the 2013 data (Jan-Dec) are still required by the data streams files. How should I set things up to meet this need?
(2) After the steps below, I think I have replaced GSWP3 with my own forcing, but the case still runs on the GSWP3 data, even though I changed the specified data path and domain path in the datm.streams.txt files.

My specific procedure is as follows:


./create_newcase --case my_atmforc --res CLM_USRDAT --compset I2000Clm50SpGs --mach jack --run-unsupported
....
./xmlchange DATM_MODE=CLMGSWP3v1
./xmlchange DATM_CLMNCEP_YR_START=2013
./xmlchange DATM_CLMNCEP_YR_END=2013
./xmlchange STOP_OPTION=ndays
./xmlchange STOP_N=30
./xmlchange RUN_STARTDATE=2013-06-01
./xmlchange STOP_DATE=20130630
...
./case.setup
add:
tintalgo = "nearest", "nearest", "nearest", "nearest", "nearest" to user_nl_datm
Then copy the three datm.streams.txt.CLMGSWP3v1.Precip/Solar/TPQW files from CaseDocs to the current case directory and revise the relevant paths (the datm.streams files and user_nl_clm are attached below).
Then ./preview_namelists, build, and run.
It runs successfully; however, the run is still based on the GSWP3 data, and I don't know what's wrong.

Any suggestions would be highly appreciated, thanks!


Attachments

  • datm.streams.txt.CLMGSWP3v1.Precip.png (45.4 KB)
  • datm.streams.txt.CLMGSWP3v1.Solar.png (45 KB)
  • datm.streams.txt.CLMGSWP3v1.TPQW.png (49 KB)
  • user_nl_clm.txt (206 bytes)

Keith Oleson
CSEG and Liaisons
Staff member
When you copy the three datm streams files from CaseDocs to your case directory and change them, prepend "user_" to the file names, e.g.,

user_datm.streams.txt.CLMGSWP3v1.Precip
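That copy-and-rename step could be scripted roughly like this (a sketch; the `install_user_streams` helper and the CaseDocs layout are assumptions based on the steps above, and the paths must be adjusted to your own case):

```python
import shutil
from pathlib import Path

def install_user_streams(casedir, streams=("Precip", "Solar", "TPQW")):
    """Copy the datm stream files out of CaseDocs, prepending 'user_'.

    'casedir' is the case directory; the CaseDocs layout assumed here
    follows the thread. After copying, edit each user_ file to point
    at your own forcing and domain files.
    """
    copied = []
    for stream in streams:
        src = Path(casedir) / "CaseDocs" / f"datm.streams.txt.CLMGSWP3v1.{stream}"
        dst = Path(casedir) / f"user_datm.streams.txt.CLMGSWP3v1.{stream}"
        shutil.copy(src, dst)  # preview_namelists then picks up the user_ copy
        copied.append(dst.name)
    return copied
```

The key point is only the "user_" prefix: without it, the copies in the case directory are ignored and the default GSWP3 streams are regenerated.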
 

jack
Member
When you copy the three datm streams files from CaseDocs to your case directory and change them, prepend "user_" to the file names, e.g.,

user_datm.streams.txt.CLMGSWP3v1.Precip
Thanks a lot, this issue has been fixed. There is another issue: given that my atm forcing has 1 h temporal resolution (i.e., 720 records in total in one solar file), I see that the 'time' dimension of my solar file must start at 0 (i.e., time: 0:719); however, this cannot satisfy the requirement that 'nearest' time stamps should correspond to the middle of the data interval. Do you have any suggestions?
 

jack
Member
In other words, I see the 'time' dimension in the GSWP3 solar file is 0 : 0.125 : 29.875 and in the others is 0.0625 : 0.125 : 29.9375, so do I have to set the maximum value of the 'time' dimension to 30 or 31? In my case (1 h resolution, i.e., 720 records in one solar file), I set the 'time' dimension to 0:719; does that work?
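For comparison, the two GSWP3-style 3-hourly axes quoted above (for a 30-day month, in fractional days) can be generated like this (a pure-Python sketch):

```python
# GSWP3-style time axes for a 30-day month, in fractional days.
# Solar is stamped at the *start* of each 3-hour interval:
solar_time = [n * 0.125 for n in range(240)]           # 0, 0.125, ..., 29.875
# Precip/TPHWL are stamped at the *middle* of each interval:
other_time = [0.0625 + n * 0.125 for n in range(240)]  # 0.0625, ..., 29.9375
```

So in both cases the maximum stays below 30; an hourly axis of 0..719 only lines up with these if its units are hours rather than days.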
 

jack
Member
When you copy the three datm streams files from CaseDocs to your case directory and change them, prepend "user_" to the file names, e.g.,

user_datm.streams.txt.CLMGSWP3v1.Precip
Dear oleson, could you please help me with this question?
I see the 'time' dimension in the GSWP3 solar file is 0 : 0.125 : 29.875 and in the others is 0.0625 : 0.125 : 29.9375, so do I have to set the maximum value of the 'time' dimension to 30 or 31? In my case (1 h resolution, i.e., 720 records in one solar file), I set the 'time' dimension to 0:719; does that work?
 

Keith Oleson
CSEG and Liaisons
Staff member
Sorry for the slow response, out on vacation.
Did you solve your problem?
 

Keith Oleson
CSEG and Liaisons
Staff member
Ok, well I'm not quite sure I understand your question.
But if you want to just run for June (30 days) and you have hourly solar radiation data, then I think your time dimension would be 30 X 24 = 720, and the values of time on the file would be:

0., 0.0416, 0.0833, etc.
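Those values are just n/24 for n = 0..719; a quick pure-Python check:

```python
# Hourly time stamps in fractional days for a 30-day month (June).
time_days = [n / 24.0 for n in range(30 * 24)]      # 720 values
first_three = [round(t, 4) for t in time_days[:3]]  # [0.0, 0.0417, 0.0833]
```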
 

jack
Member
Ok, well I'm not quite sure I understand your question.
But if you want to just run for June (30 days) and you have hourly solar radiation data, then I think your time dimension would be 30 X 24 = 720, and the values of time on the file would be:

0., 0.0416, 0.0833, etc.
Thanks a lot!
 

jack
Member
Ok, well I'm not quite sure I understand your question.
But if you want to just run for June (30 days) and you have hourly solar radiation data, then I think your time dimension would be 30 X 24 = 720, and the values of time on the file would be:

0., 0.0416, 0.0833, etc.
Hi oleson, here is one more question: if the time dimension is 720, is it feasible to define the "time" values on the file as 0, 1, 2, 3, ..., 719 rather than 0, 0.0416, 0.0833, ...? Does it affect the order in which the model reads the forcing data? Thanks.
 

Keith Oleson
CSEG and Liaisons
Staff member
I think that would be ok as long as your units attribute for the time dimension had "hours since..." instead of "days since...".
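That equivalence can be sanity-checked in a few lines of pure Python (the "since" dates below are hypothetical examples):

```python
# The same 720 hourly instants encoded two ways, matching the units attribute:
hours = list(range(720))               # units = "hours since 2013-06-01" (date assumed)
days = [h / 24.0 for h in range(720)]  # units = "days since 2013-06-01"

# Converted to a common unit they describe identical instants
# (tolerance allows for floating-point rounding of h/24).
assert all(abs(h - d * 24.0) < 1e-9 for h, d in zip(hours, days))
```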
 

jack
Member
I think that would be ok as long as your units attribute for the time dimension had "hours since..." instead of "days since...".
I do not understand the statement that "Solar should be time stamped at the beginning of the 3hr period". My atmospheric forcing also has 3 h resolution; does this just mean that the time dimension of the solar file starts from 0 (e.g., 0, 0.125, 0.25, ...)? Or does it also depend on the structure of the raw solar data? For example, my raw solar data is a 3-hourly mean (from −1.5 h to +1.5 h), so I regard the raw solar value itself as valid at the middle of the 3 h period; since I set the first time sample to time zero, should the offset be set to -5400 rather than -10800?
Actually, I have met the same problem as the post "Problems with custumed atmospheric forcings in clm5.0". This confuses me a lot. From that thread: "I do see now that you said that CMFD precip 'is 3-hourly mean (from −3.0hr to 0.0hr) precipitation rate'. And you have the first time sample stamped at time zero, which is at the end of the first time period. GSWP3 is time stamped at the middle of a 3-hour period (in accordance with the User's Guide). But you've used the offset to move CMFD back 1.5 hours so that it is effectively now at the middle of the time period (-1.5 hrs), so I think you have done it correctly. But that is just my opinion, it's obviously up to you to decide."
 

jack
Member
I think that would be ok as long as your units attribute for the time dimension had "hours since..." instead of "days since...".
I do not understand the statement that "Solar should be time stamped at the beginning of the 3hr period". My atmospheric forcing also has 3 h resolution; does this just mean that the time dimension of the solar file starts from 0 (e.g., 0, 0.125, 0.25, ...)? Or does it also depend on the structure of the raw solar data? For example, my raw solar data is a 3-hourly mean (from −1.5 h to +1.5 h), so I regard the raw solar value itself as valid at the middle of the 3 h period; since I set the first time sample to time zero, should the offset be set to -5400 rather than -10800?
Actually, I have met the same problem as the post "Problems with custumed atmospheric forcings in clm5.0". This confuses me a lot.

Do you have any suggestions? The wind, TBOT, and QBOT are instantaneous values, and I also set the first time sample of the TPQW file to time zero. Do I need to set an offset?
 

jack
Member
I do not understand the statement that "Solar should be time stamped at the beginning of the 3hr period". My atmospheric forcing also has 3 h resolution; does this just mean that the time dimension of the solar file starts from 0 (e.g., 0, 0.125, 0.25, ...)? Or does it also depend on the structure of the raw solar data? For example, my raw solar data is a 3-hourly mean (from −1.5 h to +1.5 h), so I regard the raw solar value itself as valid at the middle of the 3 h period; since I set the first time sample to time zero, should the offset be set to -5400 rather than -10800?
Actually, I have met the same problem as the post "Problems with custumed atmospheric forcings in clm5.0". This confuses me a lot.
I also plotted the model output FSDS under three configurations: (1) -5400 offset, 0.5 h time step; (2) -10800 offset, 1 h time step; (3) -10800 offset, 0.5 h time step, compared with the in-situ FSDS (red line) and the original raw solar data (CMFD, green line). Which configuration do you think is reasonable?
 

Attachments

  • 1.png (113.9 KB)

Keith Oleson
CSEG and Liaisons
Staff member
From the User's Guide (1.2.4. Customizing the DATM namelist — ctsm CTSM master documentation)

For coszen the time-stamps of the data should correspond to the beginning of the interval the data is measured for. Either make sure the time-stamps on the datafiles is set this way, or use the offset described above to set it.
For nearest and linear the time-stamps of the data should correspond to the middle of the interval the data is measured for. Either make sure the time-stamps on the datafiles is set this way, or use the offset described above to set it.

My interpretation of this, applied to your data, is that since solar is a mean of -1.5 to 1.5 hours, either set the first time stamp to -5400/86400 = -0.0625 with offset = 0, or set the first time stamp to 0 and the offset = -5400.
For the other variables (TPQWL and precip), the first time stamp should be set to the middle of the interval the data is measured for. If not, then the offset can be used to move the data forward or backward. For example, if precipitation is the average of 0 to 3 hours, then the first time stamp should be 5400/86400 = 0.0625. For instantaneous data, I think the time stamps would be set to the time the data is valid for. E.g., if the first value of temperature is instantaneous at 1.5 hours, then set the first time sample to 0.0625 and offset = 0.
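The conversions in that interpretation (stream offset in seconds to fractional days, using the numbers quoted above) work out as follows; a small sketch:

```python
SECONDS_PER_DAY = 86400

def offset_days(offset_seconds):
    """Convert a datm stream offset given in seconds to fractional days."""
    return offset_seconds / SECONDS_PER_DAY

# Solar: 3-hourly mean over [-1.5 h, +1.5 h] with the first stamp written at 0,
# so shift it back 1.5 h to land the stamp mid-interval:
solar_offset = offset_days(-5400)      # -0.0625 days

# Precip: mean over [0 h, 3 h], so the mid-interval stamp itself is +1.5 h:
precip_midpoint = offset_days(5400)    # +0.0625 days
```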
 

jack
Member
From the User's Guide (1.2.4. Customizing the DATM namelist — ctsm CTSM master documentation)

For coszen the time-stamps of the data should correspond to the beginning of the interval the data is measured for. Either make sure the time-stamps on the datafiles is set this way, or use the offset described above to set it.
For nearest and linear the time-stamps of the data should correspond to the middle of the interval the data is measured for. Either make sure the time-stamps on the datafiles is set this way, or use the offset described above to set it.

My interpretation of this, applied to your data, is that since solar is a mean of -1.5 to 1.5 hours, either set the first time stamp to -5400/86400 = -0.0625 with offset = 0, or set the first time stamp to 0 and the offset = -5400.
For the other variables (TPQWL and precip), the first time stamp should be set to the middle of the interval the data is measured for. If not, then the offset can be used to move the data forward or backward. For example, if precipitation is the average of 0 to 3 hours, then the first time stamp should be 5400/86400 = 0.0625. For instantaneous data, I think the time stamps would be set to the time the data is valid for. E.g., if the first value of temperature is instantaneous at 1.5 hours, then set the first time sample to 0.0625 and offset = 0.
Thanks for your reply; with your help I think I have done it correctly, i.e., the first time stamp of all the data was set to 0; for precip and solar, offset = -5400, and for TPQW, offset = 0. However, I suspect the timeline of the CLM output variables was also shifted because of the offset.
For example, the model output 0.5-hourly TBOT was 0.5 hour later than the raw 3-hourly TBOT, and the model output 0.5-hourly TG was 1 hour later than the in-situ LST. I looked at atm.log and found that the last two time samples (i.e., 82800 and 84600) were read from the next month's file. For example, for a run over Jan-Mar 2016, I found:

(shr_dmodel_readstrm) close : /home/jj/jack/inputdata/atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/TPHWL/ncl/nclclmforc.Qian.c2006.T62.TPQW.2016-03.nc
(shr_dmodel_readstrm) open : /home/jj/jack/inputdata/atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/TPHWL/ncl/nclclmforc.Qian.c2006.T62.TPQW.2016-04.nc
(shr_dmodel_readstrm) file ub: /home/jj/jack/inputdata/atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/TPHWL/ncl/nclclmforc.Qian.c2006.T62.TPQW.2016-04.nc 1
(datm_comp_run) atm: model date 20160331 77400s
(datm_comp_run) atm: model date 20160331 79200s
(datm_comp_run) atm: model date 20160331 81000s
(shr_dmodel_readstrm) file ub: /home/jj/jack/inputdata/atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/Solar/ncl/nclclmforc.Qian.c2006.T62.Solr.2016-04.nc 2
(shr_dmodel_readstrm) file ub: /home/jj/jack/inputdata/atm/datm7/atm_forcing.datm7.GSWP3.0.5d.v1.c170516/Precip/ncl/nclclmforc.Qian.c2006.T62.Prec.2016-04.nc 2
(datm_comp_run) atm: model date 20160331 82800s
(datm_comp_run) atm: model date 20160331 84600s

(datm_comp_run) writing heihe2016_new.datm.rs1.2016-04-01-00000.bin20160401 0s

The samples with date 20160331 82800 and 84600 were extracted from the 2016-04 files. So I would like to ask whether the model output variables (e.g., TG) need to be moved forward by one hour? A screenshot of atm.log is attached below.
 

Attachments

  • 1629444621(1).png (127.6 KB)

jack
Member
Sorry, the model output 0.5-hourly TG was 1.5 hours later than the in-situ LST, so I would like to ask whether the model output variables (e.g., TG) need to be moved forward by 1.5 hours? atm.log.txt is attached below.
 

Attachments

  • atm.log.210820.txt (588.9 KB)

Keith Oleson
CSEG and Liaisons
Staff member
If you believe that the forcing data is set up correctly, then I wouldn't recommend moving it around just to get model output variables to match up with whatever in-situ data you might have. There could be many reasons why model output such as TG doesn't agree with in-situ data (e.g., surface characterization, deficiencies in parameterizations). I assume you are accounting for any difference in time zone between the in-situ and model data.
 