aerler@atmosp_physics_utoronto_ca
New Member
Hi,

I'm comparing precipitation extremes in CESM, WRF (10 km), and station observations using the block-maxima/GEV approach. I'm using monthly PRECTMX output to get the CESM values.

The results from WRF are as expected: the magnitude is systematically lower than the observations (about 40% lower), but the distribution is similar. In CESM, however, the magnitude is systematically higher than observations (50% or more). This is exactly the opposite of what I would expect, since there should be a significant grid-averaging effect in CESM that reduces variance (and thus extremes). The average monthly mean precipitation in CESM is somewhat lower than in WRF, and WRF is generally closer to observations (the seasonal cycle in particular is better).

I'm not sure why the magnitude of the precipitation extremes in CESM is so large. I'm assuming that PRECTMX simply gives me the daily average precipitation rate (in m/s; 1 m/s of liquid water = 1000 kg/m^2/s) of the day with the highest precipitation total in the output period; is this correct?

Does anyone have an explanation for this? I've attached a plot showing maximum precipitation in WRF (top row) and CESM (bottom row) for my area of interest. From left to right, annual, summer, and winter averages of the monthly precipitation maxima are shown (in mm/day).

Any help would be greatly appreciated; I really don't know what to make of this.

Thanks!
Andre
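For reference, here is a minimal sketch of the unit conversion and block-maxima/GEV fit I'm doing (the synthetic gamma-distributed input is only a stand-in for the real monthly PRECTMX fields, and the 30-year length is arbitrary; the actual analysis reads CESM history output):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for monthly PRECTMX output in m/s, 30 years x 12 months;
# the real values come from the CESM monthly history files.
prectmx_ms = rng.gamma(shape=2.0, scale=1.5e-7, size=(30, 12))

# Unit conversion: 1 m/s of liquid water = 1000 mm/s = 8.64e7 mm/day.
prectmx_mmday = prectmx_ms * 1000.0 * 86400.0

# Block maxima: one maximum per year (block = calendar year).
annual_max = prectmx_mmday.max(axis=1)

# GEV fit; note scipy's genextreme uses c = -xi, i.e. the shape parameter
# has the opposite sign of the usual GEV convention.
c, loc, scale = stats.genextreme.fit(annual_max)

# Example: 20-year return level from the fitted distribution.
rl20 = stats.genextreme.ppf(1.0 - 1.0 / 20.0, c, loc=loc, scale=scale)
print(f"shape={c:.3f}, loc={loc:.1f} mm/day, 20-yr return level={rl20:.1f} mm/day")
```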