
Forever queued in Derecho -- Not Running: Insufficient amount of resource: Qlist

Andrea Salazar
New Member
Hi all,

I submitted a case on Derecho after everything compiled successfully. However, the job has been queued for several days, and when I run qstat -f <jobID>, I get the following comment:
Not Running: Insufficient amount of resource: Qlist

I have since tried to re-submit but am having the same problem.
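
For reference, these are the commands I have been using to check on the job (standard PBS client commands; the job ID is the one shown in the full output below):

qstat -u $USER                                  # list my jobs -- the state just stays "Q"
qstat -f 1305682.desched1 | grep -i comment     # scheduler comment for the stuck job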

If it is helpful, here is the full output from qstat -f:

Job Id: 1305682.desched1
Job_Name = run.piControl_debug_CLD_2
Job_Owner = asalazar@derecho6.hsn.de.hpc.ucar.edu
job_state = Q
queue = cpu
server = desched1.hsn.de.hpc.ucar.edu
Account_Name = UHAR0013
Checkpoint = u
ctime = Mon Aug 28 08:58:36 2023
depend = beforeok:1305683.desched1@desched1.hsn.de.hpc.ucar.edu
Error_Path = derecho6.hsn.de.hpc.ucar.edu:/glade/u/home/asalazar/cesm2/cesm
_caseroot/piControl_debug_CLD_2/run.piControl_debug_CLD_2.e1305682
Hold_Types = n
Join_Path = oe
Keep_Files = od
Mail_Points = a
mtime = Mon Aug 28 08:58:36 2023
Output_Path = derecho6.hsn.de.hpc.ucar.edu:/glade/u/home/asalazar/cesm2/ces
m_caseroot/piControl_debug_CLD_2/run.piControl_debug_CLD_2.o1305682
Priority = 200
qtime = Mon Aug 28 08:58:36 2023
Rerunable = False
Resource_List.mem = 4320gb
Resource_List.mpiprocs = 1152
Resource_List.ncpus = 576
Resource_List.ngpus = 0
Resource_List.nodect = 9
Resource_List.place = scatter:exclhost
Resource_List.preempt_targets = QUEUE=pcpu
Resource_List.select = 9:ncpus=64:mpiprocs=128:ompthreads=1:mem=480GB:ngpus
=0:mps=1
Resource_List.walltime = 04:00:00
Shell_Path_List = /bin/bash
substate = 10
Variable_List = PBS_O_HOME=/glade/u/home/asalazar,PBS_O_LANG=en_US.UTF-8,
PBS_O_LOGNAME=asalazar,
PBS_O_PATH=/glade/u/apps/cseg/derecho/23.06/spack/opt/spack/linux-sles
15-x86_64_v3/oneapi-2023.0.0/esmf-8.6.0b03-mj2dwmrl2uvuzvbyrvmqx35rul4m
qai7/bin:/glade/u/apps/derecho/23.06/spack/opt/spack/parallel-netcdf/1.
12.3/cray-mpich/8.1.25/oneapi/2023.0.0/blyr/bin:/glade/u/apps/derecho/2
3.06/spack/opt/spack/netcdf/4.9.2/cray-mpich/8.1.25/oneapi/2023.0.0/wzo
l/bin:/glade/u/apps/derecho/23.06/spack/opt/spack/hdf5/1.12.2/cray-mpic
h/8.1.25/oneapi/2023.0.0/ktun/bin:/glade/u/apps/derecho/23.06/spack/opt
/spack/ncarcompilers/1.0.0/oneapi/2023.0.0/ec7b/bin/mpi:/opt/cray/pe/pa
ls/1.2.11/bin:/opt/cray/libfabric/1.15.2.0/bin:/opt/cray/pe/mpich/8.1.2
5/ofi/intel/19.0/bin:/opt/cray/pe/mpich/8.1.25/bin:/glade/u/apps/derech
o/23.06/spack/opt/spack/cmake/3.26.3/gcc/7.5.0/l2rq/bin:/glade/u/apps/d
erecho/23.06/spack/opt/spack/ncarcompilers/1.0.0/oneapi/2023.0.0/ec7b/b
in:/glade/u/apps/derecho/23.06/spack/opt/spack/intel-oneapi-mkl/2023.0.
0/oneapi/2023.0.0/f2sh/mkl/2023.0.0/bin/intel64:/glade/u/apps/common/23
.04/spack/opt/spack/intel-oneapi-compilers/2023.0.0/compiler/2023.0.0/l
inux/lib/oclfpga/bin:/glade/u/apps/common/23.04/spack/opt/spack/intel-o
neapi-compilers/2023.0.0/compiler/2023.0.0/linux/bin/intel64:/glade/u/a
pps/common/23.04/spack/opt/spack/intel-oneapi-compilers/2023.0.0/compil
er/2023.0.0/linux/bin:/opt/cray/pe/craype/2.7.20/bin:/glade/u/apps/dere
cho/23.06/opt/utils/bin:/glade/u/apps/cseg/derecho/python/lib64/python3
.10/site-packages/bin:/ncar/usr/jupyterhub.hpc.ucar.edu/jupyterhub-2022
0511/bin:/glade/u/home/asalazar/.vscode-server/bin/6445d93c81ebe42c4cbd
7a60712e0b17d9463e97/bin/remote-cli:/opt/clmgr/sbin:/opt/clmgr/bin:/opt
/sgi/sbin:/opt/sgi/bin:/glade/u/home/asalazar/.local/bin:/opt/c3/bin:/u
sr/lib/mit/bin:/usr/lib/mit/sbin:/opt/pbs/bin:/glade/u/apps/derecho/23.
06/opt/bin:/usr/local/bin:/usr/bin:/sbin:/bin:/opt/cray/pe/bin,
PBS_O_MAIL=/var/spool/mail/asalazar,PBS_O_SHELL=/usr/bin/bash,
PBS_O_WORKDIR=/glade/u/home/asalazar/cesm2/cesm_caseroot/piControl_deb
ug_CLD_2,PBS_O_SYSTEM=Linux,ARGS_FOR_SCRIPT=--resubmit,PBS_O_QUEUE=main,
PBS_O_HOST=derecho6.hsn.de.hpc.ucar.edu
comment = Not Running: Insufficient amount of resource: Qlist
etime = Mon Aug 28 08:58:36 2023
umask = 22
run_count = 0
Submit_arguments = -q main -l walltime=04:00:00 -A UHAR0013 -v ARGS_FOR_SCR
IPT=--resubmit .case.run
project = _pbs_project_default
Submit_Host = derecho6.hsn.de.hpc.ucar.edu
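
If I am reading the output correctly, the job was submitted to the main routing queue (Submit_arguments shows -q main) and was then routed to the cpu execution queue. My guess -- and it is only a guess, since Qlist looks like a site-defined PBS resource rather than a standard one -- is that the scheduler cannot find enough nodes whose Qlist value matches this queue for my job. Below are the extra checks and the hypothetical workaround I was considering; I have not confirmed that regular users can see all of this on Derecho, and <queue_name> is just a placeholder:

qstat -Qf cpu | grep -i qlist           # does the cpu execution queue set or require a Qlist value?
pbsnodes -a | grep -i qlist | sort -u   # which Qlist values do the compute nodes advertise?

# In the CESM case directory, point the case at a different queue and resubmit
./xmlchange JOB_QUEUE=<queue_name>
./case.submit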

Thank you for any advice!
 

Bill Sacks
CSEG and Liaisons
Staff member
Can you please send an email to help@ucar.edu to see if the systems group has any suggestions? If they feel it's a CESM-specific issue, then we'll try to help you here.
 