Problem With LW Downscaling in CLM5

I am experiencing a potential bug while running both CESM2 and standalone CLM5. In the standalone CLM5 case, I am running on a 1.1 km grid with topography at that resolution (and all other fields interpolated) in parts of the Andes, East Africa, etc. Downscaling of fluxes and ground temperature looks fine. Naively, I'd expect the FLDS downscaled to high-altitude ice columns to differ from the gridcell mean by approximately 100 Wm^-2. But there is no difference between the FLDS and FLDS_ICE fields, suggesting something has gone wrong with LW downscaling. (Each grid cell typically has 1% glacier coverage.) There have been no source code modifications from CESM 2.1.1, and the settings for longwave downscaling are in place in lnd_in. It should be no surprise, then, that I think I may be getting more meltwater than appropriate in these simulations.
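(For scale: if I am reading the CLM5 Technical Note correctly, downwelling longwave is assumed to decrease with elevation at roughly 0.032 Wm^-2 per meter, so an ice column sitting about 3 km above the gridcell-mean topography should see FLDS adjusted by about 0.032 x 3000 ≈ 96 Wm^-2. That is where my ~100 Wm^-2 expectation comes from.)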

I am additionally concerned because this behavior is present over most of the globe in the CLM output file I have from the CMIP6 pre-industrial control for CESM2:

b.e21.B1850.f09_g17.CMIP6-piControl.001.clm2.h0.1100-12.nc

When I calculate FLDS-FLDS_ICE in this file (attached), it is zero over Greenland and Antarctica (and in temperate regions, etc.) but nonzero along the margins of Greenland. In other words, longwave downscaling seems to be happening in GLACIER_REGION=1 but not in GLACIER_REGION=0,2,3.
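In case anyone wants to reproduce the check, here is a minimal sketch of the comparison using the netCDF Fortran interface (untested as written; it assumes the usual 'lon' and 'lat' dimension names and that FLDS_ICE carries the standard spval fill value wherever a grid cell has no ice columns):

program check_flds_downscaling
  ! Sketch: compare FLDS and FLDS_ICE in a CLM monthly history (h0) file.
  use netcdf
  implicit none
  character(len=*), parameter :: fname = &
       'b.e21.B1850.f09_g17.CMIP6-piControl.001.clm2.h0.1100-12.nc'
  integer :: ncid, dimid, varid, nlon, nlat
  real, allocatable :: flds(:,:), flds_ice(:,:)

  call check( nf90_open(fname, nf90_nowrite, ncid) )
  call check( nf90_inq_dimid(ncid, 'lon', dimid) )
  call check( nf90_inquire_dimension(ncid, dimid, len=nlon) )
  call check( nf90_inq_dimid(ncid, 'lat', dimid) )
  call check( nf90_inquire_dimension(ncid, dimid, len=nlat) )
  allocate(flds(nlon,nlat), flds_ice(nlon,nlat))

  ! Read the first (and, in a monthly h0 file, only) time record of each field.
  call check( nf90_inq_varid(ncid, 'FLDS', varid) )
  call check( nf90_get_var(ncid, varid, flds, start=[1,1,1], count=[nlon,nlat,1]) )
  call check( nf90_inq_varid(ncid, 'FLDS_ICE', varid) )
  call check( nf90_get_var(ncid, varid, flds_ice, start=[1,1,1], count=[nlon,nlat,1]) )
  call check( nf90_close(ncid) )

  ! Ignore cells where FLDS_ICE is fill (no ice columns; spval ~ 1.e36).
  print *, 'max |FLDS - FLDS_ICE| = ', &
       maxval(abs(flds - flds_ice), mask=(flds_ice < 1.e35))

contains
  subroutine check(status)
    integer, intent(in) :: status
    if (status /= nf90_noerr) then
      print *, trim(nf90_strerror(status))
      stop 1
    end if
  end subroutine check
end program check_flds_downscaling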

Consulting Lawrence et al. (2019) and the relevant CLM5 documentation, I do not see any reason why LW downscaling should be restricted to one glacier_region type. Is this expected behavior? Should longwave downscaling only be happening along the margins of Greenland?

Best regards,

Nicholas G. Heavens, PhD, FGS, FRAS, FRMetS
Research Scientist
Space Science Institute
 

Attachments

  • Screen Shot 2021-04-07 at 9.54.56 PM.png (133.3 KB)
Looking through the downscaling code, I should also point out that my high-resolution simulations impose an ice cover of only 1% (except where the standard CLM output would be greater). Thus, any excess LW should be redistributed to the 99% of the ground not covered by ice. So the problem does not arise from setting 100% ice everywhere, which indeed would make FLDS and FLDS_ICE identical.
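(To put rough numbers on it: if the 1% ice column sees FLDS reduced by 100 Wm^-2, conservation only requires the remaining 99% of the grid cell to gain about 0.01 x 100 / 0.99 ≈ 1 Wm^-2, so the glacier columns should still show a large signal.)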
 
After adding many print statements, I have verified that the code is working as programmed. If you need the grid-scale FLDS field to compare with the downscaled columns, you should change the history output in atm2lndType.F90 to:

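! Set the gridcell-level field to the special value, then register FLDS
! as a gridcell-level ('ptr_gcell') history field pointing at the
! non-downscaled atmospheric longwave forcing: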
this%forc_lwrad_not_downscaled_grc(begg:endg) = spval
call hist_addfld1d (fname='FLDS', units='W/m^2', &
     avgflag='A', long_name='atmospheric longwave radiation (grid scale)', &
     ptr_gcell=this%forc_lwrad_not_downscaled_grc)

The LW downscaling routines are inappropriate for my standalone simulations, but that's not actually a bug, and it is easily fixed with a simple source code modification.
 

Bill Sacks (CSEG and Liaisons, staff member):
Hi Nicholas,

I was going through my backlog of emails from last week and was about to reply to this one, then saw that it seems you have solved your problem. But this situation is tricky enough that I wanted to reply anyway, for your sake and the sake of others who may come across this thread.

Although longwave downscaling does not directly depend on glacier region, it does depend on it indirectly, because the longwave downscaling is only done over certain columns:

  • In regions where glacier_region_behavior = 'single_at_atm_topo' (areas outside Greenland and Antarctica in standard simulations), there is no downscaling at all.
  • In regions where there are multiple glacier columns per grid cell, but where CISM is not running (Antarctica in standard simulations), downscaling is done over glacier columns but NOT over other landunits. If I remember correctly, because of the conservation normalization applied to longwave radiation, this means that FLDS_ICE (which gives the grid cell average value of FLDS over glacier columns) will be constrained to be the same as the average FLDS received from the atmosphere: there is no transfer of longwave radiation between glacier and non-glacier landunits.
  • Finally, in regions where CISM is running (Greenland in standard simulations), downscaling is done over all landunits (other than urban), so it is only in these areas that you will see a difference between FLDS_ICE and FLDS. And, as you stated, FLDS_ICE is identical to FLDS over most of Greenland, because most grid cells there are 100% ice.
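Schematically (from memory, so double-check against the code): the normalization rescales the downscaled values so that sum_c( wt_c * FLDS_c ) = FLDS_atm, where the sum runs only over the columns being downscaled and wt_c are their relative area weights. When glacier columns are the only columns being downscaled, their weighted mean, which is exactly what FLDS_ICE reports, is therefore pinned to the gridcell forcing.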

I realize that these differences between regions can be very confusing. The reason we do it this way is that, in regions where CISM is running, we have information about the elevation of non-glacier columns, and in this part of the world we want to treat all landunits as similarly as possible to avoid spurious behavior changes when the ice sheet advances or retreats. Outside of the CISM domain, however, we do not have any information on the elevation of non-glacier columns, so we cannot do any meaningful downscaling of atmospheric fields over them.

Finally, note that you can turn off the longwave downscaling via the namelist flag glcmec_downscale_longwave. You can set this to .false. in user_nl_clm.
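That is, in user_nl_clm:

glcmec_downscale_longwave = .false.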

Let me know if you have any further questions about this.
 