
Ocean_only configuration of regional MOM6 reads atmospheric forcing only at initial time

Maria.Aristizabal

Maria Aristizabal
New Member
Hello,
I am running an Ocean_Only configuration of a regional MOM6 in the North Atlantic. I am providing atmospheric forcing from the Global Forecast System (GFS) every 3 hours. The variables provided in the atmospheric forcing file are: u and v wind stresses, long- and short-wave radiation fluxes, sensible and latent heat fluxes, precipitation rate, and others. The model runs in this configuration; however, when I check the model output, the surface stresses (taux and tauy), the shortwave and longwave radiation fluxes into the ocean (SW and LW), the latent and sensible heat fluxes into the ocean, and the surface friction velocity (ustar) are all constant throughout the run and very close to the fields provided in the atmospheric forcing at the initial time of the run. In addition, the sea surface temperature increases linearly throughout the run and does not exhibit a daily cycle. This indicates that MOM6 is only reading the atmospheric forcing at the initial time of the run. Instead, I need the model to read the atmospheric forcing at least every 3 hours, which is the time interval of the GFS output.
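For reference, a quick check along these lines makes the problem visible (a sketch in Python with xarray; the diagnostic file and variable names are illustrative and should be adapted to your diag_table):

Code:
# Quick check: do the surface-forcing diagnostics vary in time?
# A field that is read only once has (near-)zero spread along time.
import xarray as xr

ds = xr.open_dataset("ocean_sfc.nc")  # illustrative diagnostics file name
for var in ["taux", "tauy", "SW", "LW", "latent", "sensible", "ustar"]:
    if var in ds:
        spread = float((ds[var].max("time") - ds[var].min("time")).max())
        print(f"{var}: max spread over the run = {spread:.3e}")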
If anybody has suggestions of how to fix this issue it would be greatly appreciated.
Regards,
Maria Aristizabal
 
Hi Maria! My understanding is that you need to run the coupled ice-ocean model to use that sort of atmospheric forcing. It is the ice model that computes the bulk fluxes; for CESM, it is the coupler that computes these things.
 

Maria.Aristizabal

Maria Aristizabal
New Member
Hi Kate!
Thanks for the quick response! Initially we are not coupling MOM6 with any other components. We did not want to couple it with the ice model because we are trying to set up MOM6 as the ocean component of the Hurricane Analysis and Forecast System (HAFS), so eventually it will be coupled to FV3. But first we are trying to set it up in the ocean_only configuration, and it seems that in this configuration it is either not possible to read the atmospheric forcing the way we need, or we are doing it incorrectly.
 

gmarques

Gustavo Marques
Moderator
Staff member
Hi Maria,
Take a look at the MOM_input settings used in this example to make sure you have the correct surface-forcing options enabled for ocean-only. Also, make sure the time covered by the atmospheric forcing files is consistent with the model time set in input.nml.
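For ocean-only forcing from files, the options involved look along these lines (a sketch only; check the exact names and values against the example's MOM_parameter_doc.all, and the file name here is hypothetical):

Code:
! MOM_input surface-forcing options for an ocean-only run (illustrative)
WIND_CONFIG  = "file"    ! read wind stresses from a file in INPUT/
WIND_FILE    = "atm.nc"  ! hypothetical forcing file name
BUOY_CONFIG  = "file"    ! read heat/freshwater fluxes from files
READ_GUST_2D = False     ! whether to read a 2-D gustiness field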
Best,
Gustavo
 

Maria.Aristizabal

Maria Aristizabal
New Member
Hi Gustavo,
Thanks for your answer!
I compared my settings in MOM_input with your example for the module MOM_surface_forcing, and my flags are the same, except that READ_GUST_2D = False in my settings. I also made sure that the time span of our forcing file covers the time window of the run; for example, our forcing file starts 6 hours before the initial time of the run, so that should be fine. What else can I check?
Best,
Maria
 

gmarques

Gustavo Marques
Moderator
Staff member
Hi Maria,

One more thing to check is the attributes of the forcing file(s) -- in particular, those associated with the variable "time". Are they consistent with the model settings?
Below is an example from MOM6-examples/ocean_only/global_ALE/z/INPUT/ocean_forcing_daily.nc

Code:
netcdf ocean_forcing_daily {
dimensions:
    xh = 360 ;
    yh = 210 ;
    time = UNLIMITED ; // (365 currently)
    nv = 2 ;
    xq = 360 ;
    yq = 210 ;
variables:
    double xh(xh) ;
        xh:long_name = "h point nominal longitude" ;
        xh:units = "degrees_E" ;
        xh:cartesian_axis = "X" ;
    double yh(yh) ;
        yh:long_name = "h point nominal latitude" ;
        yh:units = "degrees_N" ;
        yh:cartesian_axis = "Y" ;
    double time(time) ;
        time:long_name = "time" ;
        time:units = "days since 0001-01-01 00:00:00" ;
        time:cartesian_axis = "T" ;
        time:calendar_type = "NOLEAP" ;
        time:calendar = "NOLEAP" ;
        time:bounds = "time_bounds" ;
    double nv(nv) ;
        nv:long_name = "vertex number" ;
        nv:units = "none" ;
        nv:cartesian_axis = "N" ;
    double xq(xq) ;
        xq:long_name = "q point nominal longitude" ;
        xq:units = "degrees_E" ;
        xq:cartesian_axis = "X" ;
    double yq(yq) ;
        yq:long_name = "q point nominal latitude" ;
        yq:units = "degrees_N" ;
        yq:cartesian_axis = "Y" ;
    float SW(time, yh, xh) ;
        SW:long_name = "Shortwave radiation flux into ocean" ;
        SW:units = "Watt meter-2" ;
        SW:missing_value = 1.e+20f ;
        SW:_FillValue = 1.e+20f ;
        SW:cell_methods = "time: mean" ;
        SW:time_avg_info = "average_T1,average_T2,average_DT" ;
        SW:standard_name = "surface_net_downward_shortwave_flux" ;
    float LW(time, yh, xh) ;
        LW:long_name = "Longwave radiation flux into ocean" ;
        LW:units = "Watt meter-2" ;
        LW:missing_value = 1.e+20f ;
        LW:_FillValue = 1.e+20f ;
        LW:cell_methods = "time: mean" ;
        LW:time_avg_info = "average_T1,average_T2,average_DT" ;
        LW:standard_name = "surface_net_downward_longwave_flux" ;
    float latent(time, yh, xh) ;
        latent:long_name = "Latent heat flux into ocean due to fusion and evaporation" ;
        latent:units = "Watt meter-2" ;
        latent:missing_value = 1.e+20f ;
        latent:_FillValue = 1.e+20f ;
        latent:cell_methods = "time: mean" ;
        latent:time_avg_info = "average_T1,average_T2,average_DT" ;
    float sensible(time, yh, xh) ;
        sensible:long_name = "Sensible heat flux into ocean" ;
        sensible:units = "Watt meter-2" ;
        sensible:missing_value = 1.e+20f ;
        sensible:_FillValue = 1.e+20f ;
        sensible:cell_methods = "time: mean" ;
        sensible:time_avg_info = "average_T1,average_T2,average_DT" ;
        sensible:standard_name = "surface_downward_sensible_heat_flux" ;
    float evap(time, yh, xh) ;
        evap:long_name = "Evaporation at ocean surface (usually negative)" ;
        evap:units = "kilogram meter-2 second-1" ;
        evap:missing_value = 1.e+20f ;
        evap:_FillValue = 1.e+20f ;
        evap:cell_methods = "time: mean" ;
        evap:time_avg_info = "average_T1,average_T2,average_DT" ;
    float taux(time, yh, xq) ;
        taux:long_name = "Zonal Wind Stress" ;
        taux:units = "Pascal" ;
        taux:missing_value = 1.e+20f ;
        taux:_FillValue = 1.e+20f ;
        taux:cell_methods = "time: mean" ;
        taux:time_avg_info = "average_T1,average_T2,average_DT" ;
        taux:standard_name = "surface_downward_x_stress" ;
    float tauy(time, yq, xh) ;
        tauy:long_name = "Meridional Wind Stress" ;
        tauy:units = "Pascal" ;
        tauy:missing_value = 1.e+20f ;
        tauy:_FillValue = 1.e+20f ;
        tauy:cell_methods = "time: mean" ;
        tauy:time_avg_info = "average_T1,average_T2,average_DT" ;
        tauy:standard_name = "surface_downward_y_stress" ;
    float ustar(time, yh, xh) ;
        ustar:long_name = "Surface friction velocity" ;
        ustar:units = "meter second-1" ;
        ustar:missing_value = 1.e+20f ;
        ustar:_FillValue = 1.e+20f ;
        ustar:cell_methods = "time: mean" ;
        ustar:time_avg_info = "average_T1,average_T2,average_DT" ;
    float SST(time, yh, xh) ;
        SST:long_name = "Sea Surface Temperature" ;
        SST:units = "Celsius" ;
        SST:missing_value = -1.e+34f ;
        SST:_FillValue = -1.e+34f ;
        SST:cell_methods = "time: mean" ;
        SST:time_avg_info = "average_T1,average_T2,average_DT" ;
    float SSS(time, yh, xh) ;
        SSS:long_name = "Sea Surface Salinity" ;
        SSS:units = "PSU" ;
        SSS:missing_value = -1.e+34f ;
        SSS:_FillValue = -1.e+34f ;
        SSS:cell_methods = "time: mean" ;
        SSS:time_avg_info = "average_T1,average_T2,average_DT" ;
    double average_T1(time) ;
        average_T1:long_name = "Start time for average period" ;
        average_T1:units = "days since 0001-01-01 00:00:00" ;
        average_T1:missing_value = 1.e+20 ;
        average_T1:_FillValue = 1.e+20 ;
    double average_T2(time) ;
        average_T2:long_name = "End time for average period" ;
        average_T2:units = "days since 0001-01-01 00:00:00" ;
        average_T2:missing_value = 1.e+20 ;
        average_T2:_FillValue = 1.e+20 ;
    double average_DT(time) ;
        average_DT:long_name = "Length of average period" ;
        average_DT:units = "days" ;
        average_DT:missing_value = 1.e+20 ;
        average_DT:_FillValue = 1.e+20 ;
    double time_bounds(time, nv) ;
        time_bounds:long_name = "time axis boundaries" ;
        time_bounds:units = "days" ;
        time_bounds:missing_value = 1.e+20 ;
        time_bounds:_FillValue = 1.e+20 ;

// global attributes:
        :filename = "ocean_forcing_daily.nc" ;
        :title = "GOLD_SIS" ;
        :grid_type = "regular" ;
        :grid_tile = "N/A" ;
}
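To dump just the time attributes and coverage for comparison with the model clock in input.nml, something like this works (a sketch using the netCDF4 Python package):

Code:
# Print the time coordinate's attributes and coverage so they can be
# compared with the calendar and run window set in input.nml.
import netCDF4

with netCDF4.Dataset("INPUT/ocean_forcing_daily.nc") as nc:
    t = nc.variables["time"]
    print({name: t.getncattr(name) for name in t.ncattrs()})
    print("first:", t[0], "last:", t[-1], "records:", len(t))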
 

Maria.Aristizabal

Maria Aristizabal
New Member
Hi Gustavo,
Thanks for your suggestion! I checked the atmospheric forcing netCDF file that I produced, and many of its attributes are consistent with those in your example. But I noticed that your netCDF file has an extra dimension (nv) and an extra variable (time_bounds, the time axis boundaries) that I do not have. Is this variable necessary for MOM6 to read the atmospheric forcing correctly in the ocean_only configuration?

I also noticed that the wind stresses are on a staggered grid. When I produced my netCDF file with the wind stresses on the staggered grid, I got the following error message:

NOTE from PE 0: GLOBAL ATT too long - not reading this metadata
1135 633 1 1136 633 1

FATAL from PE 0: fms_io(read_data_3d_new), field UFLX_surface in file INPUT/atm.nc: field size mismatch 1


FATAL from PE 0: fms_io(read_data_3d_new), field UFLX_surface in file INPUT/atm.nc: field size mismatch 1

Abort(1) on node 0 (rank 0 in comm 0): application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0
In: PMI_Abort(1, application called MPI_Abort(MPI_COMM_WORLD, 1) - process 0)
1135 633 1 1136 633 1
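The sizes in the message suggest a mismatch between the 1135x633 and 1136x633 grids for UFLX_surface. A quick way to print what the file actually contains (a sketch; VFLX_surface is assumed by analogy with UFLX_surface, which appears in the log):

Code:
# Print the dimensions and shapes of the wind-stress fields so they can
# be compared against the sizes in the FATAL message above.
import netCDF4

with netCDF4.Dataset("INPUT/atm.nc") as nc:
    for name in ("UFLX_surface", "VFLX_surface"):
        if name in nc.variables:
            v = nc.variables[name]
            print(name, v.dimensions, v.shape)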


I am copying here the dimensions and the attributes of the time variable of the netcdf file that I am using.

Code:
dimensions:
    time = UNLIMITED ; // (42 currently)
    lath = 633 ;
    lonh = 1135 ;
    lonq = 1136 ;
    latq = 634 ;
variables:
    double time(time) ;
        time:units = "seconds since 1970-01-01 00:00:00" ;
        time:long_name = "time" ;
        time:reference_time = 1598313600. ;
        time:reference_time_type = 3 ;
        time:reference_date = "2020.08.25 00:00:00 UTC" ;
        time:reference_time_description = " " ;
        time:time_step_setting = "auto" ;
        time:time_step = 0. ;
        time:short_name = "time" ;
        time:calendar = "julian" ;
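In case time_bounds turns out to be necessary, adding it after the fact is straightforward; here is a sketch with the netCDF4 Python package (assumes evenly spaced time records; the file name is illustrative):

Code:
# Sketch: add an nv dimension and a time_bounds variable, mirroring the
# layout of the example file.  Assumes evenly spaced time records.
import netCDF4

with netCDF4.Dataset("INPUT/atm.nc", "a") as nc:
    time_var = nc.variables["time"]
    time = time_var[:]
    dt = time[1] - time[0]
    nc.createDimension("nv", 2)
    tb = nc.createVariable("time_bounds", "f8", ("time", "nv"))
    tb.long_name = "time axis boundaries"
    tb.units = time_var.units
    tb[:, 0] = time - dt / 2.0
    tb[:, 1] = time + dt / 2.0
    time_var.bounds = "time_bounds"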
 

gmarques

Gustavo Marques
Moderator
Staff member
Hi Maria,

I do not know if time_bounds is needed to read the atmospheric forcing correctly. Instead of testing that in your configuration, I suggest first making sure you can run and modify MOM6-examples/ocean_only/global_ALE/z.
 

gmarques

Gustavo Marques
Moderator
Staff member
Hi Maria,
The FTP link is working for me; I just logged in as a guest. I do not know if these datasets are available on other NOAA machines.
 

jsteffen

John Steffen
New Member
Hi Gustavo,

I've been working with Maria to set up our ocean-only configuration of MOM6. We took your advice and ran the /global_ALE/z/ example case. However, I am getting the same error when reading in the forcing file, /INPUT/ocean_forcing_daily.nc. Any idea what could cause this?


WARNING from PE 0: num_timelevels: variable taux in file INPUT/ocean_forcing_daily.nc has fewer than min_dims = 3 dimensions.

FATAL from PE 35: NETCDF ERROR: NetCDF: HDF error File=INPUT/ocean_forcing_daily.nc


Thank you for your feedback!
John
 

gmarques

Gustavo Marques
Moderator
Staff member
Hi John,

What versions of MOM6 and FMS are you using? There have been some changes in the native netCDF calls, and that might be the issue. To be safe, I suggest using the heads specified here.
 

jsteffen

John Steffen
New Member
Hi Gustavo,

Thank you for the help!

I checked out MOM6-examples back in January. The versions of MOM6 and FMS that produced the failed test are:
MOM6 - commit df46be459304cc3571d0a2c7cc5727fed41ff076
FMS - commit f61416fef691d9ba39a40df1ce72aa574f54c390

I went ahead and checked out the MOM6-examples main branch with the heads you suggested (git clone -b main --recursive https://github.com/NOAA-GFDL/MOM6-examples.git ./). I reran the test and received the same error message I listed in the prior post.
MOM6 - commit 233f982c23e91e0b211b57c43e69a62bf7f0e4fa
FMS - commit f61416fef691d9ba39a40df1ce72aa574f54c390

So, this leads me to think that the different versions of MOM6 and FMS aren't the cause of this error. Do you have an idea of what else could cause the error? Thank you!

John
 

gmarques

Gustavo Marques
Moderator
Staff member
Hi John,

That's puzzling. I've been testing all the cases in MOM6-examples on Cheyenne using the Intel compiler. Here are the modules I used on my latest test:
  • intel/19.1.1
  • netcdf/4.7.4
  • mpt/2.22
Which compiler and netCDF library are you using?
 

jsteffen

John Steffen
New Member
Hi Gustavo,

We have been running our own tests and the MOM6-examples cases on Orion using the Intel compiler. Here are the modules we are using (among others):

intel/2018.4
impi/2018.4
netcdfp/4.7.4

-John
 

marshallward

Marshall Ward
New Member
Maria, you may be seeing a bug which was recently detected and fixed:


I am not sure if MOM6-examples was pointing to this version when you last checked out your code.

Could you try either the NOAA-GFDL:dev/gfdl branch, or the mom-ocean:main branch?

 

Jiande

Jiande Wang
New Member
John: can you tell me your run directory on Orion? Make sure you chmod it, as Orion is by default not accessible to people outside your group.
 

Maria.Aristizabal

Maria Aristizabal
New Member
Hi Marshall,
Thanks for your response. The machine where I have been running MOM6 is under maintenance today, but as soon as I have access I will pull the MOM6 submodule so that it points to the latest version of the NOAA-GFDL:dev/gfdl branch. I am really hoping this fixes our issue. I will let you know how it goes.
Maria
 

Jiande

Jiande Wang
New Member
I just made a fresh clone of the code and tried the ocean-only/global_ALE/z run on Orion; it works fine. Some extra information:
MOM6-examples code: dev/gfdl 63c227f MOM6: (*)Updated version of MOM6 after MOM6 PR #69; this is the head of dev/gfdl
My run directory: /work/noaa/marine/Jiande.Wang/working/MOM6-z-test/RUN/ocean-only/global_ALE/z
Note that I used the FMS1 infra code; switching to the FMS2 infra code leads to a segmentation fault.
 