Hi, I apologize for the complete newbie question (I asked this for CESM in another part of the forum, but I also need to ask about running CLM offline):
I've never run CLM before, and I'm trying to estimate roughly how many resources I would need on my local university HPC cluster to do so. What are the typical resources required (in terms of number of nodes/cores) to run a global CLM simulation at a decent resolution (e.g., 1 degree) for a few decades within an acceptable amount of time? I'd also be very grateful if you could describe how much disk space is needed to store the model output in that case (for a typical number of variables and temporal resolutions).
Thanks so much!