
How are fixed SST inputs generated?

jrvb

Rob von Behren
New Member
I'm curious about how the SST input files for compsets like F2000climo and F2010climo are generated. (e.g., files like $DIN_LOC_ROOT/atm/cam/sst/sst_HadOIBl_bc_0.9x1.25_2010climo_c180511.nc)

I remember reading somewhere that these are created from measured SST data for the years surrounding the target year, but I can't find any information on how this is typically done, whether local temperature peaks and troughs are represented or get averaged away, etc. Does anyone have a pointer to information on how this is done?

Thanks much!

-Rob
 

zarzycki

New Member
Hi Rob,

This is not *exactly* my expertise, but please see: $CESMROOT/components/cam/tools/icesst/README.

Essentially, the climatological file contains the monthly mean SST and ice averaged over some reference period (in years). In the DOCN model, the actual SST at a given timestep is linearly interpolated between the two bounding states (i.e., June 28 uses some weighted combination of the June 15 and July 15 values). There is an "adjustment" process that is done to ensure that the monthly mean after performing this linear interpolation actually matches the observed monthly mean.
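A minimal sketch of that time interpolation (my own illustration, not the actual DOCN code; it assumes a 365-day calendar with values anchored at approximate month midpoints):

```python
import numpy as np

def interp_sst(day_of_year, mid_month_vals):
    """Linearly interpolate between the two bounding mid-month values.

    day_of_year    : fractional day on a 365-day calendar
    mid_month_vals : 12 climatological values anchored at month midpoints
    """
    month_len = np.array([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31],
                         dtype=float)
    mids = np.cumsum(month_len) - month_len / 2.0  # Jan ~15.5, Jun ~166, ...
    # pad with Dec of the previous year and Jan of the next year
    # so the Dec -> Jan interpolation wraps correctly
    ext_mids = np.concatenate([[mids[-1] - 365.0], mids, [mids[0] + 365.0]])
    ext_vals = np.concatenate([[mid_month_vals[-1]], mid_month_vals,
                               [mid_month_vals[0]]])
    return np.interp(day_of_year % 365.0, ext_mids, ext_vals)
```

For example, a day in late June gets a value between the June and July anchors, weighted by its distance from each midpoint.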

It is worth noting that the file you mention will not contain short-term variations in SST in either time or space (e.g., no TC cold wakes, no diurnal cycle, no discrete mesoscale eddies, etc.).

Further, performing an ncdump on the above file will give you the reference year range:

Code:
(base) zarzycki@cheyenne2:~> ncdump -h /glade/p/cesmdata/inputdata/atm/cam/sst/sst_HadOIBl_bc_0.9x1.25_2010climo_c180511.nc
netcdf sst_HadOIBl_bc_0.9x1.25_2010climo_c180511 {
dimensions:
    lon = 288 ;
    lat = 192 ;
    time = UNLIMITED ; // (12 currently)
variables:
    int date(time) ;
        date:long_name = "current date (YYYYMMDD)" ;
    int datesec(time) ;
        datesec:long_name = "current seconds of current date" ;
    double lon(lon) ;
        lon:long_name = "longitude" ;
        lon:units = "degrees_east" ;
    double lat(lat) ;
        lat:long_name = "latitude" ;
        lat:units = "degrees_north" ;
    double time(time) ;
        time:units = "days since 0000-01-01 00:00:00" ;
        time:calendar = "365_day" ;
    float ice_cov(time, lat, lon) ;
        ice_cov:long_name = "BCS Pseudo Sea-ice concentration" ;
        ice_cov:units = "fraction" ;
    float ice_cov_prediddle(time, lat, lon) ;
        ice_cov_prediddle:long_name = "Sea-ice concentration before time diddling" ;
        ice_cov_prediddle:units = "fraction" ;
    float SST_cpl(time, lat, lon) ;
        SST_cpl:long_name = "BCS Pseudo SST" ;
        SST_cpl:units = "deg_C" ;
    float SST_cpl_prediddle(time, lat, lon) ;
        SST_cpl_prediddle:long_name = "SST before time diddling" ;
        SST_cpl_prediddle:units = "deg_C" ;

// global attributes:
        :history = "N/A" ;
        :data_mods = "N/A" ;
        :climo_years = "2005-2015" ;
        :data_reference = "Hurrell et al, 2008: A New Sea Surface Temperature and Sea Ice Boundary Dataset for the Community Atmosphere Model, J. Clim., 21, 5145-5153" ;
        :data_doi = "N/A" ;
        :data_source_url = "via dennis shea ftp://ftp.cgd.ucar.edu/archive/SSTICE/" ;
        :data_script = "regrid and bcgen under model tools" ;
        :data_creator = "Julie Caron, jcaron@ucar.edu" ;
        :cesm_contact = "Cecile Hannay, hannay@ucar.edu" ;
        :data_description = "SST and ICE boundary dataset created from merged Reynolds/HADISST products, as in Hurrell et al 2008 (see data reference below) via Dennis Shea. Products have been regridded and time interpolated using Karl Taylor\'s time interpolation scheme (bcgen). Climatologies are computed in the bcgen code." ;
        :data_summary = "Climatological SST and ICE boundary dataset for CAM" ;
        :creation_date = "Fri May 11 18:57:41 MDT 2018" ;

-Colin
 

jrvb

Rob von Behren
New Member
This is very helpful - thanks Colin! I hadn't noticed before that the file metadata includes a paper reference; that's a very handy tip. :)
I'm still unclear on how the averaging is done across years. For example, a naive approach would be to take each grid cell from 2005-2015 (the date range in the file metadata) and average the January values across years to get the climatological January field for the fixed SST file. That will tend to give an SST file with no extremes, though, which would look quite different from any real year.
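In code, the naive approach I have in mind would be something like this (illustrative only; I don't know whether this is what bcgen actually does):

```python
import numpy as np

def naive_monthly_climo(monthly_sst):
    """monthly_sst: array of shape (n_years, 12, nlat, nlon)
    of monthly-mean SST over the reference period.
    Returns shape (12, nlat, nlon): each calendar month
    averaged across years, independently per grid cell."""
    return np.asarray(monthly_sst).mean(axis=0)
```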

I see in the Hurrell et al paper that they are careful to try to keep these sorts of local extremes when bringing together different data sources (with the caveat, of course, that the monthly averages will also tend to damp out extremes that exist on shorter time scales...). I can't find any description of how the 11 years of input are condensed down to 1 year for the SST data file, though. (The data_description mentions time interpolation "using Karl Taylor's time interpolation scheme (bcgen)" --- but my attempts to search for more info on bcgen are coming up short....)

Any other pointers would be most welcome!

-Rob
 