cesm1_2_2_CAMChem Running Issue - NetCDF: Invalid dimension ID or name
Sure, we will port CESM2 on our system if that is an option. I just want to be sure: can I do everything in CESM2 that was possible in CESM 1.2.2 CAM-Chem?
Hi,

Well, I didn't shift to CESM2; I was experimenting with CESM 1.2.2 CAM-Chem. I created a case using:

create_newcase -case -res T31_g37 -compset F1850CNCHM -mach

(the case name and machine arguments are omitted here). I would expect this to be a fairly small case, yet the run fails with the error below, specifically:

0: ALLOCATE: 8589934588 bytes requested; not enough memory

Can you please suggest what is going wrong here, or what kind of memory it needs more of (stack size, heap size)? Is there any alternative? The full log follows:

(seq_comm_setcomm) initialize ID ( 1 GLOBAL ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 2 CPL ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 17 ATM ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 18 CPLATM ) join IDs = 2 17 ( npes = 16) ( nthreads = 1)
(seq_comm_jcommarr) initialize ID ( 3 ALLATMID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 10 CPLALLATMID ) join IDs = 2 3 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 19 LND ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 20 CPLLND ) join IDs = 2 19 ( npes = 16) ( nthreads = 1)
(seq_comm_jcommarr) initialize ID ( 4 ALLLNDID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 11 CPLALLLNDID ) join IDs = 2 4 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 21 OCN ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 22 CPLOCN ) join IDs = 2 21 ( npes = 16) ( nthreads = 1)
(seq_comm_jcommarr) initialize ID ( 5 ALLOCNID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 12 CPLALLOCNID ) join IDs = 2 5 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 23 ICE ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 24 CPLICE ) join IDs = 2 23 ( npes = 16) ( nthreads = 1)
(seq_comm_jcommarr) initialize ID ( 6 ALLICEID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 13 CPLALLICEID ) join IDs = 2 6 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 25 GLC ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 26 CPLGLC ) join IDs = 2 25 ( npes = 16) ( nthreads = 1)
(seq_comm_jcommarr) initialize ID ( 7 ALLGLCID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 14 CPLALLGLCID ) join IDs = 2 7 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 27 ROF ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 28 CPLROF ) join IDs = 2 27 ( npes = 16) ( nthreads = 1)
(seq_comm_jcommarr) initialize ID ( 8 ALLROFID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 15 CPLALLROFID ) join IDs = 2 8 ( npes = 16) ( nthreads = 1)
(seq_comm_setcomm) initialize ID ( 29 WAV ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 30 CPLWAV ) join IDs = 2 29 ( npes = 16) ( nthreads = 1)
(seq_comm_jcommarr) initialize ID ( 9 ALLWAVID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)
(seq_comm_joincomm) initialize ID ( 16 CPLALLWAVID ) join IDs = 2 9 ( npes = 16) ( nthreads = 1)
(seq_comm_printcomms) 1 0 16 1 GLOBAL:
(seq_comm_printcomms) 2 0 16 1 CPL:
(seq_comm_printcomms) 3 0 16 1 ALLATMID:
(seq_comm_printcomms) 4 0 16 1 ALLLNDID:
(seq_comm_printcomms) 5 0 16 1 ALLOCNID:
(seq_comm_printcomms) 6 0 16 1 ALLICEID:
(seq_comm_printcomms) 7 0 16 1 ALLGLCID:
(seq_comm_printcomms) 8 0 16 1 ALLROFID:
(seq_comm_printcomms) 9 0 16 1 ALLWAVID:
(seq_comm_printcomms) 10 0 16 1 CPLALLATMID:
(seq_comm_printcomms) 11 0 16 1 CPLALLLNDID:
(seq_comm_printcomms) 12 0 16 1 CPLALLOCNID:
(seq_comm_printcomms) 13 0 16 1 CPLALLICEID:
(seq_comm_printcomms) 14 0 16 1 CPLALLGLCID:
(seq_comm_printcomms) 15 0 16 1 CPLALLROFID:
(seq_comm_printcomms) 16 0 16 1 CPLALLWAVID:
(seq_comm_printcomms) 17 0 16 1 ATM:
(seq_comm_printcomms) 18 0 16 1 CPLATM:
(seq_comm_printcomms) 19 0 16 1 LND:
(seq_comm_printcomms) 20 0 16 1 CPLLND:
(seq_comm_printcomms) 21 0 16 1 OCN:
(seq_comm_printcomms) 22 0 16 1 CPLOCN:
(seq_comm_printcomms) 23 0 16 1 ICE:
(seq_comm_printcomms) 24 0 16 1 CPLICE:
(seq_comm_printcomms) 25 0 16 1 GLC:
(seq_comm_printcomms) 26 0 16 1 CPLGLC:
(seq_comm_printcomms) 27 0 16 1 ROF:
(seq_comm_printcomms) 28 0 16 1 CPLROF:
(seq_comm_printcomms) 29 0 16 1 WAV:
(seq_comm_printcomms) 30 0 16 1 CPLWAV:
(t_initf) Read in prof_inparm namelist from: drv_in
8 MB memory alloc in MB is 8.00
8 MB memory dealloc in MB is 0.00
Memory block size conversion in bytes is 1023.50
[the three lines above appear repeatedly, interleaved across the 16 MPI tasks, in the original output]
seq_flds_mod: read seq_cplflds_inparm namelist from: drv_in
seq_flds_mod: read seq_cplflds_userspec namelist from: drv_in
seq_flds_mod: seq_flds_a2x_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:Sa_pslv:Sa_co2prog:Sa_co2diag
seq_flds_mod: seq_flds_a2x_fluxes=Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4
seq_flds_mod: seq_flds_x2a_states=Sf_lfrac:Sf_ifrac:Sf_ofrac:Sx_avsdr:Sx_anidr:Sx_avsdf:Sx_anidf:Sx_tref:Sx_qref:So_t:Sx_t:Sl_fv:Sl_ram1:Sl_snowh:Si_snowh:So_ssq:So_re:Sx_u10:So_ustar:Sl_dd001:Sl_dd002:Sl_dd003:Sl_dd004:Sl_dd005:Sl_dd006:Sl_dd007:Sl_dd008:Sl_dd009:Sl_dd010
seq_flds_mod: seq_flds_x2a_fluxes=Faxx_taux:Faxx_tauy:Faxx_lat:Faxx_sen:Faxx_lwup:Faxx_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Fall_voc001
seq_flds_mod: seq_flds_l2x_states=Sl_avsdr:Sl_anidr:Sl_avsdf:Sl_anidf:Sl_tref:Sl_qref:Sl_t:Sl_fv:Sl_ram1:Sl_snowh:Sl_u10:Sl_dd001:Sl_dd002:Sl_dd003:Sl_dd004:Sl_dd005:Sl_dd006:Sl_dd007:Sl_dd008:Sl_dd009:Sl_dd010
seq_flds_mod: seq_flds_l2x_fluxes=Fall_swnet:Fall_taux:Fall_tauy:Fall_lat:Fall_sen:Fall_lwup:Fall_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Flrl_rofliq:Flrl_rofice:Fall_voc001
seq_flds_mod: seq_flds_x2l_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Slrr_volr:Sa_co2prog:Sa_co2diag
seq_flds_mod: seq_flds_x2l_fluxes=Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Flrr_flood
seq_flds_mod: seq_flds_i2x_states=Si_avsdr:Si_anidr:Si_avsdf:Si_anidf:Si_tref:Si_qref:Si_t:Si_snowh:Si_u10:Si_ifrac
seq_flds_mod: seq_flds_i2x_fluxes=Faii_swnet:Fioi_swpen:Faii_taux:Fioi_taux:Faii_tauy:Fioi_tauy:Faii_lat:Faii_sen:Faii_lwup:Faii_evap:Fioi_melth:Fioi_meltw:Fioi_salt
seq_flds_mod: seq_flds_x2i_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:So_t:So_s:So_u:So_v:So_dhdx:So_dhdy
seq_flds_mod: seq_flds_x2i_fluxes=Faxa_rain:Faxa_snow:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Fioo_q
seq_flds_mod: seq_flds_o2x_states=So_t:So_s:So_u:So_v:So_dhdx:So_dhdy:So_bldepth
seq_flds_mod: seq_flds_o2x_fluxes=Fioo_q
seq_flds_mod: seq_flds_x2o_states=Sa_pslv:So_duu10n:Si_ifrac:Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokes
seq_flds_mod: seq_flds_x2o_fluxes=Faxa_rain:Faxa_snow:Faxa_prec:Faxa_lwdn:Foxx_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Foxx_taux:Foxx_tauy:Foxx_lat:Foxx_sen:Foxx_lwup:Foxx_evap:Fioi_melth:Fioi_meltw:Fioi_salt:Forr_roff:Forr_ioff
seq_flds_mod: seq_flds_s2x_states=
seq_flds_mod: seq_flds_s2x_fluxes=
seq_flds_mod: seq_flds_x2s_states=
seq_flds_mod: seq_flds_x2s_fluxes=
seq_flds_mod: seq_flds_g2x_states=
seq_flds_mod: seq_flds_g2x_fluxes=
seq_flds_mod: seq_flds_x2g_states=
seq_flds_mod: seq_flds_x2g_fluxes=
seq_flds_mod: seq_flds_xao_states=So_tref:So_qref:So_ssq:So_re:So_u10:So_duu10n:So_ustar
seq_flds_mod: seq_flds_xao_albedo=So_avsdr:So_anidr:So_avsdf:So_anidf
seq_flds_mod: seq_flds_r2x_states=Slrr_volr
seq_flds_mod: seq_flds_r2x_fluxes=Forr_roff:Forr_ioff:Flrr_flood
seq_flds_mod: seq_flds_x2r_states=
seq_flds_mod: seq_flds_x2r_fluxes=Flrl_rofliq:Flrl_rofice
seq_flds_mod: seq_flds_w2x_states=Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokes
seq_flds_mod: seq_flds_w2x_fluxes=
seq_flds_mod: seq_flds_x2w_states=Sa_u:Sa_v:Sa_tbot:Si_ifrac:So_t:So_u:So_v:So_bldepth
seq_flds_mod: seq_flds_x2w_fluxes=
16 pes participating in computation
-----------------------------------
 TASK#  NAME
   0  cn003
   1  cn177
   2  cn178
   3  cn183
   4  cn190
   5  cn194
   6  cn195
   7  cn196
   8  cn198
   9  cn202
  10  cn203
  11  cn207
  12  cn208
  13  gpu140
  14  cn227
  15  cn237
Opened existing file /home/cas/faculty/dilipganguly/cesm/inputdata/atm/cam/inic/gaus/cami_0000-01-01_48x96_L26_c091218.nc 0
Opened existing file /home/cas/faculty/dilipganguly/cesm/inputdata/atm/cam/topo/USGS-gtopo30_48x96_c050520.nc 1
Divergence damper for spectral dycore NOT invoked
Time filter coefficient (EPS) 0.060
DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06
DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17
Number of levels Courant limiter applied 5
Dynamics Subcycling 1
[the six dycore lines above appear repeatedly, interleaved across the 16 MPI tasks, in the original output]
0: ALLOCATE: 8589934588 bytes requested; not enough memory
0: ALLOCATE: 8589934588 bytes requested; not enough memory
===================================================================================
=   BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
=   PID 3265 RUNNING AT cn003.hpc.iitd.ac.in
=   EXIT CODE: 9
=   CLEANING UP REMAINING PROCESSES
=   YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
[proxy:0:3@cn183] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed
[proxy:0:3@cn183] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status
[proxy:0:3@cn183] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event
[the same three messages are repeated for proxy:0:6@cn195, proxy:0:10@cn203, proxy:0:1@cn177, proxy:0:7@cn196, proxy:0:13@gpu140, proxy:0:4@cn190, proxy:0:12@cn208, proxy:0:14@cn227, proxy:0:15@cn237, proxy:0:2@cn178, proxy:0:9@cn202 and proxy:0:8@cn198]
[mpiexec@cn003] HYDT_bscu_wait_for_completion (tools/bootstrap/utils/bscu_wait.c:76): one of the processes terminated badly; aborting
[mpiexec@cn003] HYDT_bsci_wait_for_completion (tools/bootstrap/src/bsci_wait.c:23): launcher returned error waiting for completion
[mpiexec@cn003] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:218): launcher returned error waiting for completion
[mpiexec@cn003] main (ui/mpich/mpiexec.c:340): process manager error waiting for completion

Thanks,
Vineet
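For reference, the failed request of 8589934588 bytes is just under 8 GiB for a single allocation on one task. The lines below are only a sketch of the workflow described above and of the usual places to look when a CESM run dies on memory: <casename> and <machine> stand for the arguments omitted in the post, the task count is illustrative, and the whole thing assumes a standard CESM 1.2.x case directory rather than being a verified fix.

    # Recreate the case as described (placeholders for the omitted arguments)
    ./create_newcase -case <casename> -res T31_g37 -compset F1850CNCHM -mach <machine>
    cd <casename>
    ./cesm_setup
    ./<casename>.build

    # Check the limits in effect on the compute nodes before resubmitting
    ulimit -s unlimited    # lift the shell stack-size limit, if the site allows it
    ulimit -v              # see whether a virtual-memory cap is imposed on the job

    # Spreading a component over more MPI tasks lowers its per-task memory.
    # In CESM 1.x the layout lives in env_mach_pes.xml, e.g. (illustrative value):
    ./xmlchange -file env_mach_pes.xml -id NTASKS_ATM -val 32
    ./cesm_setup -clean
    ./cesm_setup           # regenerate the PE layout after the change
    ./<casename>.build
    ./<casename>.submit

Whether the stack limit, a batch-system memory cap, or the per-task decomposition is actually the culprit would still have to be confirmed on the cluster itself.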
Hi,Well, I didn't shift to cesm 2. Was experimenting with cesm_1_2_2 cam chem.Created a case using:create_newcase -case -res T31_g37 -compset F1850CNCHM -mach This is a way too much small case I hope. And this is the error [ Specifically - 0: ALLOCATE: 8589934588 bytes requested; not enough memory] that I am getting.Can you please suggest what is going wrong here? Or what kind of memory does it require more? Stacksize, heap size? Or is there any alternative? (seq_comm_setcomm) initialize ID ( 1 GLOBAL ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 2 CPL ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 17 ATM ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 18 CPLATM ) join IDs = 2 17 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 3 ALLATMID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 10 CPLALLATMID ) join IDs = 2 3 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 19 LND ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 20 CPLLND ) join IDs = 2 19 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 4 ALLLNDID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 11 CPLALLLNDID ) join IDs = 2 4 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 21 OCN ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 22 CPLOCN ) join IDs = 2 21 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 5 ALLOCNID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 12 CPLALLOCNID ) join IDs = 2 5 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 23 ICE ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 24 CPLICE ) join IDs = 2 23 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 6 ALLICEID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 13 CPLALLICEID ) join IDs = 2 6 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 25 GLC ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 26 CPLGLC ) join IDs = 2 25 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 7 ALLGLCID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 14 CPLALLGLCID ) join IDs = 2 7 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 27 ROF ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 28 CPLROF ) join IDs = 2 27 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 8 ALLROFID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 15 CPLALLROFID ) join IDs = 2 8 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 29 WAV ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 30 CPLWAV ) join IDs = 2 29 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 9 ALLWAVID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 16 CPLALLWAVID ) join IDs = 2 9 ( npes = 16) ( nthreads = 1)(seq_comm_printcomms) 1 0 16 1 GLOBAL:(seq_comm_printcomms) 2 0 16 1 CPL:(seq_comm_printcomms) 3 0 16 1 ALLATMID:(seq_comm_printcomms) 4 0 16 1 ALLLNDID:(seq_comm_printcomms) 5 0 16 1 ALLOCNID:(seq_comm_printcomms) 6 0 16 1 ALLICEID:(seq_comm_printcomms) 7 0 16 1 
ALLGLCID:(seq_comm_printcomms) 8 0 16 1 ALLROFID:(seq_comm_printcomms) 9 0 16 1 ALLWAVID:(seq_comm_printcomms) 10 0 16 1 CPLALLATMID:(seq_comm_printcomms) 11 0 16 1 CPLALLLNDID:(seq_comm_printcomms) 12 0 16 1 CPLALLOCNID:(seq_comm_printcomms) 13 0 16 1 CPLALLICEID:(seq_comm_printcomms) 14 0 16 1 CPLALLGLCID:(seq_comm_printcomms) 15 0 16 1 CPLALLROFID:(seq_comm_printcomms) 16 0 16 1 CPLALLWAVID:(seq_comm_printcomms) 17 0 16 1 ATM:(seq_comm_printcomms) 18 0 16 1 CPLATM:(seq_comm_printcomms) 19 0 16 1 LND:(seq_comm_printcomms) 20 0 16 1 CPLLND:(seq_comm_printcomms) 21 0 16 1 OCN:(seq_comm_printcomms) 22 0 16 1 CPLOCN:(seq_comm_printcomms) 23 0 16 1 ICE:(seq_comm_printcomms) 24 0 16 1 CPLICE:(seq_comm_printcomms) 25 0 16 1 GLC:(seq_comm_printcomms) 26 0 16 1 CPLGLC:(seq_comm_printcomms) 27 0 16 1 ROF:(seq_comm_printcomms) 28 0 16 1 CPLROF:(seq_comm_printcomms) 29 0 16 1 WAV:(seq_comm_printcomms) 30 0 16 1 CPLWAV: (t_initf) Read in prof_inparm namelist from: drv_in8 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.008 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50 8 MB memory alloc in MB is 8.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50 8 MB memory dealloc in MB is 0.008 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.008 MB memory alloc in MB is 8.00Memory block size conversion in bytes is 1023.50Memory block size conversion in bytes is 1023.508 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00 8 MB memory dealloc in MB is 0.00 Memory block size conversion in bytes is 1023.508 MB memory dealloc in MB is 0.00 8 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50 Memory block size conversion in bytes is 1023.50 Memory block size conversion in bytes is 1023.50Memory block size conversion in bytes is 1023.50 seq_flds_mod: read seq_cplflds_inparm namelist from: drv_inseq_flds_mod: read seq_cplflds_userspec namelist from: drv_in8 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50seq_flds_mod: seq_flds_a2x_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:Sa_pslv:Sa_co2prog:Sa_co2diagseq_flds_mod: seq_flds_a2x_fluxes=Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4seq_flds_mod: seq_flds_x2a_states=Sf_lfrac:Sf_ifrac:Sf_ofrac:Sx_avsdr:Sx_anidr:Sx_avsdf:Sx_anidf:Sx_tref:Sx_qref:So_t:Sx_t:Sl_fv:Sl_ram1:Sl_snowh:Si_snowh:So_ssq:So_re:Sx_u10:So_ustar:Sl_dd001:Sl_dd002:Sl_dd003:Sl_dd004:Sl_dd005:Sl_dd006:Sl_dd007:Sl_dd008:Sl_dd009:Sl_dd010seq_flds_mod: 
seq_flds_x2a_fluxes=Faxx_taux:Faxx_tauy:Faxx_lat:Faxx_sen:Faxx_lwup:Faxx_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Fall_voc001seq_flds_mod: seq_flds_l2x_states=Sl_avsdr:Sl_anidr:Sl_avsdf:Sl_anidf:Sl_tref:Sl_qref:Sl_t:Sl_fv:Sl_ram1:Sl_snowh:Sl_u10:Sl_dd001:Sl_dd002:Sl_dd003:Sl_dd004:Sl_dd005:Sl_dd006:Sl_dd007:Sl_dd008:Sl_dd009:Sl_dd010seq_flds_mod: seq_flds_l2x_fluxes=Fall_swnet:Fall_taux:Fall_tauy:Fall_lat:Fall_sen:Fall_lwup:Fall_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Flrl_rofliq:Flrl_rofice:Fall_voc001seq_flds_mod: seq_flds_x2l_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Slrr_volr:Sa_co2prog:Sa_co2diagseq_flds_mod: seq_flds_x2l_fluxes=Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Flrr_floodseq_flds_mod: seq_flds_i2x_states=Si_avsdr:Si_anidr:Si_avsdf:Si_anidf:Si_tref:Si_qref:Si_t:Si_snowh:Si_u10:Si_ifracseq_flds_mod: seq_flds_i2x_fluxes=Faii_swnet:Fioi_swpen:Faii_taux:Fioi_taux:Faii_tauy:Fioi_tauy:Faii_lat:Faii_sen:Faii_lwup:Faii_evap:Fioi_melth:Fioi_meltw:Fioi_saltseq_flds_mod: seq_flds_x2i_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:So_t:So_s:So_u:So_v:So_dhdx:So_dhdyseq_flds_mod: seq_flds_x2i_fluxes=Faxa_rain:Faxa_snow:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Fioo_qseq_flds_mod: seq_flds_o2x_states=So_t:So_s:So_u:So_v:So_dhdx:So_dhdy:So_bldepthseq_flds_mod: seq_flds_o2x_fluxes=Fioo_qseq_flds_mod: seq_flds_x2o_states=Sa_pslv:So_duu10n:Si_ifrac:Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokesseq_flds_mod: seq_flds_x2o_fluxes=Faxa_rain:Faxa_snow:Faxa_prec:Faxa_lwdn:Foxx_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Foxx_taux:Foxx_tauy:Foxx_lat:Foxx_sen:Foxx_lwup:Foxx_evap:Fioi_melth:Fioi_meltw:Fioi_salt:Forr_roff:Forr_ioffseq_flds_mod: seq_flds_s2x_states= seq_flds_mod: seq_flds_s2x_fluxes= seq_flds_mod: seq_flds_x2s_states= seq_flds_mod: seq_flds_x2s_fluxes= seq_flds_mod: seq_flds_g2x_states= seq_flds_mod: seq_flds_g2x_fluxes= seq_flds_mod: seq_flds_x2g_states= seq_flds_mod: seq_flds_x2g_fluxes= seq_flds_mod: seq_flds_xao_states=So_tref:So_qref:So_ssq:So_re:So_u10:So_duu10n:So_ustarseq_flds_mod: seq_flds_xao_albedo=So_avsdr:So_anidr:So_avsdf:So_anidfseq_flds_mod: seq_flds_r2x_states=Slrr_volrseq_flds_mod: seq_flds_r2x_fluxes=Forr_roff:Forr_ioff:Flrr_floodseq_flds_mod: seq_flds_x2r_states= seq_flds_mod: seq_flds_x2r_fluxes=Flrl_rofliq:Flrl_roficeseq_flds_mod: seq_flds_w2x_states=Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokesseq_flds_mod: seq_flds_w2x_fluxes= seq_flds_mod: seq_flds_x2w_states=Sa_u:Sa_v:Sa_tbot:Si_ifrac:So_t:So_u:So_v:So_bldepthseq_flds_mod: seq_flds_x2w_fluxes= 16 pes participating in computation ----------------------------------- TASK# NAME 0 cn003 1 cn177 2 cn178 3 cn183 4 cn190 5 cn194 6 cn195 7 cn196 8 cn198 9 cn202 10 cn203 11 cn207 12 cn208 13 gpu140 14 cn227 15 cn237 Opened existing file /home/cas/faculty/dilipganguly/cesm/inputdata/atm/cam/inic/gaus/cami_0000-01-01_48x96_L26_c091218.nc 0 Opened existing file 
/home/cas/faculty/dilipganguly/cesm/inputdata/atm/cam/topo/USGS-gtopo30_48x96_c050520.nc 1 Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invokedDivergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Divergence damper for spectral dycore NOT invoked DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 Divergence damper for spectral dycore NOT invokedDivergence damper for spectral dycore NOT invoked Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Divergence damper for spectral dycore NOT invokedDivergence damper for spectral dycore NOT invoked Dynamics Subcycling 1 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant 
limiter applied 5 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 0: ALLOCATE: 8589934588 bytes requested; not enough memory0: ALLOCATE: 8589934588 bytes requested; not enough memory ==================================================================================== BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES= PID 3265 RUNNING AT cn003.hpc.iitd.ac.in= EXIT CODE: 9= CLEANING UP REMAINING PROCESSES= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES===================================================================================[proxy:0:3@cn183] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:3@cn183] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:3@cn183] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:6@cn195] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:6@cn195] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:6@cn195] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:10@cn203] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:10@cn203] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:10@cn203] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:1@cn177] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:1@cn177] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:1@cn177] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:7@cn196] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:7@cn196] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:7@cn196] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:13@gpu140] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:13@gpu140] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:13@gpu140] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:4@cn190] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:4@cn190] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:4@cn190] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:12@cn208] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:12@cn208] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:12@cn208] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:14@cn227] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:14@cn227] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:14@cn227] main (pm/pmiserv/pmip.c:202): demux engine error waiting for 
event[proxy:0:15@cn237] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:15@cn237] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:15@cn237] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:2@cn178] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:2@cn178] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:2@cn178] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:9@cn202] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:9@cn202] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:9@cn202] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:8@cn198] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:8@cn198] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:8@cn198] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[mpiexec@cn003] HYDT_bscu_wait_for_completion (tools/bootstrap/utils/bscu_wait.c:76): one of the processes terminated badly; aborting[mpiexec@cn003] HYDT_bsci_wait_for_completion (tools/bootstrap/src/bsci_wait.c:23): launcher returned error waiting for completion[mpiexec@cn003] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:218): launcher returned error waiting for completion[mpiexec@cn003] main (ui/mpich/mpiexec.c:340): process manager error waiting for completion Thanks,Vineet
Hi,Well, I didn't shift to cesm 2. Was experimenting with cesm_1_2_2 cam chem.Created a case using:create_newcase -case -res T31_g37 -compset F1850CNCHM -mach This is a way too much small case I hope. And this is the error [ Specifically - 0: ALLOCATE: 8589934588 bytes requested; not enough memory] that I am getting.Can you please suggest what is going wrong here? Or what kind of memory does it require more? Stacksize, heap size? Or is there any alternative? (seq_comm_setcomm) initialize ID ( 1 GLOBAL ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 2 CPL ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 17 ATM ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 18 CPLATM ) join IDs = 2 17 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 3 ALLATMID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 10 CPLALLATMID ) join IDs = 2 3 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 19 LND ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 20 CPLLND ) join IDs = 2 19 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 4 ALLLNDID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 11 CPLALLLNDID ) join IDs = 2 4 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 21 OCN ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 22 CPLOCN ) join IDs = 2 21 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 5 ALLOCNID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 12 CPLALLOCNID ) join IDs = 2 5 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 23 ICE ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 24 CPLICE ) join IDs = 2 23 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 6 ALLICEID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 13 CPLALLICEID ) join IDs = 2 6 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 25 GLC ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 26 CPLGLC ) join IDs = 2 25 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 7 ALLGLCID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 14 CPLALLGLCID ) join IDs = 2 7 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 27 ROF ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 28 CPLROF ) join IDs = 2 27 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 8 ALLROFID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 15 CPLALLROFID ) join IDs = 2 8 ( npes = 16) ( nthreads = 1)(seq_comm_setcomm) initialize ID ( 29 WAV ) pelist = 0 15 1 ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 30 CPLWAV ) join IDs = 2 29 ( npes = 16) ( nthreads = 1)(seq_comm_jcommarr) initialize ID ( 9 ALLWAVID ) join multiple comp IDs ( npes = 16) ( nthreads = 1)(seq_comm_joincomm) initialize ID ( 16 CPLALLWAVID ) join IDs = 2 9 ( npes = 16) ( nthreads = 1)(seq_comm_printcomms) 1 0 16 1 GLOBAL:(seq_comm_printcomms) 2 0 16 1 CPL:(seq_comm_printcomms) 3 0 16 1 ALLATMID:(seq_comm_printcomms) 4 0 16 1 ALLLNDID:(seq_comm_printcomms) 5 0 16 1 ALLOCNID:(seq_comm_printcomms) 6 0 16 1 ALLICEID:(seq_comm_printcomms) 7 0 16 1 
ALLGLCID:(seq_comm_printcomms) 8 0 16 1 ALLROFID:(seq_comm_printcomms) 9 0 16 1 ALLWAVID:(seq_comm_printcomms) 10 0 16 1 CPLALLATMID:(seq_comm_printcomms) 11 0 16 1 CPLALLLNDID:(seq_comm_printcomms) 12 0 16 1 CPLALLOCNID:(seq_comm_printcomms) 13 0 16 1 CPLALLICEID:(seq_comm_printcomms) 14 0 16 1 CPLALLGLCID:(seq_comm_printcomms) 15 0 16 1 CPLALLROFID:(seq_comm_printcomms) 16 0 16 1 CPLALLWAVID:(seq_comm_printcomms) 17 0 16 1 ATM:(seq_comm_printcomms) 18 0 16 1 CPLATM:(seq_comm_printcomms) 19 0 16 1 LND:(seq_comm_printcomms) 20 0 16 1 CPLLND:(seq_comm_printcomms) 21 0 16 1 OCN:(seq_comm_printcomms) 22 0 16 1 CPLOCN:(seq_comm_printcomms) 23 0 16 1 ICE:(seq_comm_printcomms) 24 0 16 1 CPLICE:(seq_comm_printcomms) 25 0 16 1 GLC:(seq_comm_printcomms) 26 0 16 1 CPLGLC:(seq_comm_printcomms) 27 0 16 1 ROF:(seq_comm_printcomms) 28 0 16 1 CPLROF:(seq_comm_printcomms) 29 0 16 1 WAV:(seq_comm_printcomms) 30 0 16 1 CPLWAV: (t_initf) Read in prof_inparm namelist from: drv_in8 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.008 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50 8 MB memory alloc in MB is 8.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50 8 MB memory dealloc in MB is 0.008 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.008 MB memory alloc in MB is 8.00Memory block size conversion in bytes is 1023.50Memory block size conversion in bytes is 1023.508 MB memory dealloc in MB is 0.008 MB memory alloc in MB is 8.00Memory block size conversion in bytes is 1023.508 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00 8 MB memory dealloc in MB is 0.00 Memory block size conversion in bytes is 1023.508 MB memory dealloc in MB is 0.00 8 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50 Memory block size conversion in bytes is 1023.50 Memory block size conversion in bytes is 1023.50Memory block size conversion in bytes is 1023.50 seq_flds_mod: read seq_cplflds_inparm namelist from: drv_inseq_flds_mod: read seq_cplflds_userspec namelist from: drv_in8 MB memory alloc in MB is 8.008 MB memory dealloc in MB is 0.00Memory block size conversion in bytes is 1023.50seq_flds_mod: seq_flds_a2x_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:Sa_pslv:Sa_co2prog:Sa_co2diagseq_flds_mod: seq_flds_a2x_fluxes=Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4seq_flds_mod: seq_flds_x2a_states=Sf_lfrac:Sf_ifrac:Sf_ofrac:Sx_avsdr:Sx_anidr:Sx_avsdf:Sx_anidf:Sx_tref:Sx_qref:So_t:Sx_t:Sl_fv:Sl_ram1:Sl_snowh:Si_snowh:So_ssq:So_re:Sx_u10:So_ustar:Sl_dd001:Sl_dd002:Sl_dd003:Sl_dd004:Sl_dd005:Sl_dd006:Sl_dd007:Sl_dd008:Sl_dd009:Sl_dd010seq_flds_mod: 
seq_flds_x2a_fluxes=Faxx_taux:Faxx_tauy:Faxx_lat:Faxx_sen:Faxx_lwup:Faxx_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Fall_voc001seq_flds_mod: seq_flds_l2x_states=Sl_avsdr:Sl_anidr:Sl_avsdf:Sl_anidf:Sl_tref:Sl_qref:Sl_t:Sl_fv:Sl_ram1:Sl_snowh:Sl_u10:Sl_dd001:Sl_dd002:Sl_dd003:Sl_dd004:Sl_dd005:Sl_dd006:Sl_dd007:Sl_dd008:Sl_dd009:Sl_dd010seq_flds_mod: seq_flds_l2x_fluxes=Fall_swnet:Fall_taux:Fall_tauy:Fall_lat:Fall_sen:Fall_lwup:Fall_evap:Fall_flxdst1:Fall_flxdst2:Fall_flxdst3:Fall_flxdst4:Flrl_rofliq:Flrl_rofice:Fall_voc001seq_flds_mod: seq_flds_x2l_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Slrr_volr:Sa_co2prog:Sa_co2diagseq_flds_mod: seq_flds_x2l_fluxes=Faxa_rainc:Faxa_rainl:Faxa_snowc:Faxa_snowl:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Flrr_floodseq_flds_mod: seq_flds_i2x_states=Si_avsdr:Si_anidr:Si_avsdf:Si_anidf:Si_tref:Si_qref:Si_t:Si_snowh:Si_u10:Si_ifracseq_flds_mod: seq_flds_i2x_fluxes=Faii_swnet:Fioi_swpen:Faii_taux:Fioi_taux:Faii_tauy:Fioi_tauy:Faii_lat:Faii_sen:Faii_lwup:Faii_evap:Fioi_melth:Fioi_meltw:Fioi_saltseq_flds_mod: seq_flds_x2i_states=Sa_z:Sa_u:Sa_v:Sa_tbot:Sa_ptem:Sa_shum:Sa_pbot:Sa_dens:So_t:So_s:So_u:So_v:So_dhdx:So_dhdyseq_flds_mod: seq_flds_x2i_fluxes=Faxa_rain:Faxa_snow:Faxa_lwdn:Faxa_swndr:Faxa_swvdr:Faxa_swndf:Faxa_swvdf:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Fioo_qseq_flds_mod: seq_flds_o2x_states=So_t:So_s:So_u:So_v:So_dhdx:So_dhdy:So_bldepthseq_flds_mod: seq_flds_o2x_fluxes=Fioo_qseq_flds_mod: seq_flds_x2o_states=Sa_pslv:So_duu10n:Si_ifrac:Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokesseq_flds_mod: seq_flds_x2o_fluxes=Faxa_rain:Faxa_snow:Faxa_prec:Faxa_lwdn:Foxx_swnet:Faxa_bcphidry:Faxa_bcphodry:Faxa_bcphiwet:Faxa_ocphidry:Faxa_ocphodry:Faxa_ocphiwet:Faxa_dstwet1:Faxa_dstwet2:Faxa_dstwet3:Faxa_dstwet4:Faxa_dstdry1:Faxa_dstdry2:Faxa_dstdry3:Faxa_dstdry4:Foxx_taux:Foxx_tauy:Foxx_lat:Foxx_sen:Foxx_lwup:Foxx_evap:Fioi_melth:Fioi_meltw:Fioi_salt:Forr_roff:Forr_ioffseq_flds_mod: seq_flds_s2x_states= seq_flds_mod: seq_flds_s2x_fluxes= seq_flds_mod: seq_flds_x2s_states= seq_flds_mod: seq_flds_x2s_fluxes= seq_flds_mod: seq_flds_g2x_states= seq_flds_mod: seq_flds_g2x_fluxes= seq_flds_mod: seq_flds_x2g_states= seq_flds_mod: seq_flds_x2g_fluxes= seq_flds_mod: seq_flds_xao_states=So_tref:So_qref:So_ssq:So_re:So_u10:So_duu10n:So_ustarseq_flds_mod: seq_flds_xao_albedo=So_avsdr:So_anidr:So_avsdf:So_anidfseq_flds_mod: seq_flds_r2x_states=Slrr_volrseq_flds_mod: seq_flds_r2x_fluxes=Forr_roff:Forr_ioff:Flrr_floodseq_flds_mod: seq_flds_x2r_states= seq_flds_mod: seq_flds_x2r_fluxes=Flrl_rofliq:Flrl_roficeseq_flds_mod: seq_flds_w2x_states=Sw_lamult:Sw_ustokes:Sw_vstokes:Sw_hstokesseq_flds_mod: seq_flds_w2x_fluxes= seq_flds_mod: seq_flds_x2w_states=Sa_u:Sa_v:Sa_tbot:Si_ifrac:So_t:So_u:So_v:So_bldepthseq_flds_mod: seq_flds_x2w_fluxes= 16 pes participating in computation ----------------------------------- TASK# NAME 0 cn003 1 cn177 2 cn178 3 cn183 4 cn190 5 cn194 6 cn195 7 cn196 8 cn198 9 cn202 10 cn203 11 cn207 12 cn208 13 gpu140 14 cn227 15 cn237 Opened existing file /home/cas/faculty/dilipganguly/cesm/inputdata/atm/cam/inic/gaus/cami_0000-01-01_48x96_L26_c091218.nc 0 Opened existing file 
/home/cas/faculty/dilipganguly/cesm/inputdata/atm/cam/topo/USGS-gtopo30_48x96_c050520.nc 1 Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invokedDivergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17Divergence damper for spectral dycore NOT invoked Divergence damper for spectral dycore NOT invoked Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Divergence damper for spectral dycore NOT invoked DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 Divergence damper for spectral dycore NOT invokedDivergence damper for spectral dycore NOT invoked Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Divergence damper for spectral dycore NOT invokedDivergence damper for spectral dycore NOT invoked Dynamics Subcycling 1 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 Time filter coefficient (EPS) 0.060 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL2 Horizontal diffusion coefficient (DIF2) 0.250E+06 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 DEL4 Horizontal diffusion coefficient (DIF4) 0.200E+17 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant 
limiter applied 5 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Number of levels Courant limiter applied 5 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 Dynamics Subcycling 1 0: ALLOCATE: 8589934588 bytes requested; not enough memory0: ALLOCATE: 8589934588 bytes requested; not enough memory ==================================================================================== BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES= PID 3265 RUNNING AT cn003.hpc.iitd.ac.in= EXIT CODE: 9= CLEANING UP REMAINING PROCESSES= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES===================================================================================[proxy:0:3@cn183] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:3@cn183] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:3@cn183] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:6@cn195] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:6@cn195] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:6@cn195] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:10@cn203] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:10@cn203] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:10@cn203] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:1@cn177] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:1@cn177] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:1@cn177] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:7@cn196] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:7@cn196] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:7@cn196] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:13@gpu140] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:13@gpu140] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:13@gpu140] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:4@cn190] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:4@cn190] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:4@cn190] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:12@cn208] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:12@cn208] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:12@cn208] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:14@cn227] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:14@cn227] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:14@cn227] main (pm/pmiserv/pmip.c:202): demux engine error waiting for 
event[proxy:0:15@cn237] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:15@cn237] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:15@cn237] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:2@cn178] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:2@cn178] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:2@cn178] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:9@cn202] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:9@cn202] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:9@cn202] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[proxy:0:8@cn198] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:887): assert (!closed) failed[proxy:0:8@cn198] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status[proxy:0:8@cn198] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event[mpiexec@cn003] HYDT_bscu_wait_for_completion (tools/bootstrap/utils/bscu_wait.c:76): one of the processes terminated badly; aborting[mpiexec@cn003] HYDT_bsci_wait_for_completion (tools/bootstrap/src/bsci_wait.c:23): launcher returned error waiting for completion[mpiexec@cn003] HYD_pmci_wait_for_completion (pm/pmiserv/pmiserv_pmci.c:218): launcher returned error waiting for completion[mpiexec@cn003] main (ui/mpich/mpiexec.c:340): process manager error waiting for completion Thanks,Vineet
CSEG and Liaisons
Staff member
No, the model doesn't have that kind of memory requirement, especially not at T31_g37. This allocation error points to a problem with your compiler or somewhere else in the local system. You can dig in and try to debug it, or look into using a different compiler version or vendor.
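Incidentally, the requested size (8589934588 bytes) is 2^33 - 4, which can be a sign of an overflowed or negative size reaching ALLOCATE rather than a genuine 8 GB requirement. Before rebuilding with another compiler, it can also help to rule out shell and batch-job limits with a quick check along these lines (a minimal sketch, assuming a bash-based job script on the cluster; none of these commands are CESM-specific):

# Print the per-process limits the job actually sees
ulimit -s        # stack size
ulimit -d        # data segment (heap)
ulimit -v        # virtual memory

# Lift the stack limit before mpiexec launches the model
ulimit -s unlimited

# Check how much memory each compute node really has
free -g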
Hello,

I just changed the compiler and the CESM version. I am now using the Intel compilers with CESM 2, and I have resolved the issues related to my architecture. Currently I am again facing an error related to NetCDF. Please let me know whether the NetCDF version (or PnetCDF) is having issues, or whether I am making a mistake. I have attached the logs along with debug info.

Thanks,
Vineet
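One quick sanity check on the NetCDF side is to confirm which libraries the case was built against. A minimal sketch, assuming the same environment (modules) used for the build is loaded; flag availability varies with the netcdf-c version:

# Versions of the NetCDF C and Fortran libraries visible in this environment
nc-config --version
nf-config --version

# Whether the NetCDF C library was built with PnetCDF support
nc-config --has-pnetcdf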
CSEG and Liaisons
Staff member
It looks like the last file opened by CAM is corrupted or in a format not recognized by your NetCDF build. Look in atm.log to determine what that file is, then use ncdump -k to check its format. You might also try removing the file so that it is downloaded again; this often solves the problem.
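As a concrete way to follow this from the run directory, something along these lines can be used (a sketch; the log name pattern and the file path are illustrative, not taken from your case):

# Find the last input file CAM reported opening
grep "Opened existing file" atm.log.* | tail -n 2

# Illustrative placeholder for whatever file the log points at
BAD_FILE=/path/to/inputdata/atm/cam/suspect_file.nc

# Check the on-disk format; a truncated or corrupted download usually makes
# ncdump fail outright or report an unexpected kind
ncdump -k "$BAD_FILE"

# If it looks wrong, move it aside so the CESM scripts fetch a fresh copy
# from the input-data server on the next submission
mv "$BAD_FILE" "$BAD_FILE.bad"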