CRC RANGE Partitions in SLURM

Available Partitions, QOS, Wall Time Limits

RANGE Partitions
Partition Name   Maximum GPUs per Group   Maximum Run Time (HH:MM:SS)
commons          32                       24:00:00

The RANGE cluster currently has a single partition, commons, with a maximum run time of 24 hours. This partition limits each group to 32 GPUs in total across all of the group's concurrently running jobs, enforced through a GrpTRES resource limit.
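As a minimal sketch of a job that stays within these limits, a batch script might look like the following. The account name is a placeholder (obtain yours with sacctmgr as described below), and the GPU generic resource is assumed to be named gpu; verify the name on your system with scontrol show node.

```shell
#!/bin/bash
#SBATCH --job-name=gpu-test      # descriptive job name
#SBATCH --partition=commons      # the single RANGE partition
#SBATCH --account=yourAccount    # placeholder; use your group's account name
#SBATCH --gres=gpu:1             # request 1 GPU; group total is capped at 32
#SBATCH --time=24:00:00          # must not exceed the 24-hour partition limit

# Placeholder payload; replace with your actual GPU workload.
nvidia-smi
```

Jobs that would push the group's concurrent GPU total past 32 remain pending until other group jobs finish.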

Use the following command to determine the partitions to which you have access. Note that the Account column in the output must be supplied to your batch script along with the partition name.

sacctmgr show assoc cluster=range user=netID
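To narrow the output to just the fields a batch script needs, sacctmgr accepts a format= option; the column names below are standard sacctmgr association fields, and netID is a placeholder for your login.

```shell
# Show only the account, partition, and QOS columns for your
# associations on the RANGE cluster.
sacctmgr show assoc cluster=range user=netID format=Account,Partition,QOS
```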

For general information about the partition, wall time limit, and available nodes, use the sinfo command:

[user@login2]$ sinfo
PARTITION AVAIL TIMELIMIT  NODES STATE NODELIST
commons*  up    1-00:00:00 12    mix   bg2u16g1,bg2u24g1,bg3u16g1,bg3u24g1,bg4u7g1,bg4u9g1,bg4u11g1,bg4u13g1,bg5u16g1,bg5u24g1,bg6u16g1,bg6u24g1

Advanced usage:
Use the following command to inspect the raw QOS records and see the cluster's limits directly. The table above summarizes this information for readability.

sacctmgr show qos



Keywords:
CRC RANGE Accounts qos queue partition slurm limits 
Doc ID:
156642
Owned by:
Clinton H. in Rice U
Created:
2025-11-11
Updated:
2026-03-23
Sites:
Rice University