CRC NOTS Description


Introduction


NOTS (Night Owls Time-Sharing Service) is a batch-scheduled HPC/HTC cluster running on the Rice Big Research Data (BiRD) cloud infrastructure. The system consists of 298 dual-socket compute blades housed in HPE s6500, HPE Apollo 2000, and Dell PowerEdge C6400 chassis. All nodes are interconnected with a 10 or 25 Gigabit Ethernet network, and the Apollo and C6400 chassis are additionally connected with a high-speed Omni-Path fabric for message-passing applications. A 160 TB Lustre filesystem is attached to the compute nodes via Ethernet. The system supports a variety of workloads, including single-node, parallel, large-memory multithreaded, and GPU jobs.
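For message-passing applications, the Omni-Path fabric carries the MPI traffic between nodes. A minimal sketch of such a job is the C program below; it assumes an MPI implementation (for example Open MPI or Intel MPI) is available on the cluster, and the mpicc wrapper and file names are illustrative rather than site-specific instructions.

    /* hello_mpi.c: each MPI rank reports which node it is running on.
     * Compile with e.g. `mpicc hello_mpi.c -o hello_mpi` and launch it
     * through the batch scheduler with the site's MPI launcher.
     */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char **argv)
    {
        int rank, size, len;
        char node[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's rank */
        MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of ranks */
        MPI_Get_processor_name(node, &len);    /* node hostname */

        printf("Rank %d of %d running on %s\n", rank, size, node);

        MPI_Finalize();
        return 0;
    }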

Compute

Hardware | Nodes | CPU | Cores | Hyperthreaded | RAM | Disk | High Speed Network | Storage Network
HPE SL230s | 136 | Intel(R) Xeon(R) CPU E5-2650 v2 @ 2.60GHz | 16 | Yes | varies: 32 GB to 128 GB | 4 TB/node | None | 10 GbE
HPE XL170r | 28 | Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz | 24 | Yes | varies: 32 GB to 128 GB | 200 GB/node | Omni-Path | 10 GbE
Dell PowerEdge C6420 | 60 | Intel(R) Xeon(R) Gold 6126 CPU @ 2.60GHz | 24 | Yes | 192 GB | 120 GB/node | Omni-Path | 10 GbE
HPE XL170r | 52 | Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz | 40 | Yes | 192 GB | 960 GB/node | Omni-Path | 25 GbE
HPE XL170r | 4 | Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz | 40 | Yes | 1.5 TB | 960 GB/node | Omni-Path | 25 GbE
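A single-node, large-memory multithreaded job can use all of the cores on one node, for example the 40 cores of the Gold 6230 nodes in the table above. The OpenMP sketch below illustrates this; the gcc flag is only an example, and the compilers and modules actually installed on NOTS may differ.

    /* omp_hello.c: spawn one OpenMP thread per available core on a node.
     * Compile with e.g. `gcc -fopenmp omp_hello.c -o omp_hello`.
     */
    #include <stdio.h>
    #include <omp.h>

    int main(void)
    {
        #pragma omp parallel
        {
            /* Each thread reports its id and the size of the team. */
            printf("Thread %d of %d\n", omp_get_thread_num(), omp_get_num_threads());
        }
        return 0;
    }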

GPU

Hardware | Nodes | CPU | Cores | Hyperthreaded | RAM | Disk | GPU | High Speed Network | Storage Network
HPE SL270s | 2 | Intel(R) Xeon(R) CPU E5-2650 v2 @ 2.60GHz | 16 | Yes | 128 GB | 4 TB/node | 4 x Tesla K80 | None | 10 GbE
HPE XL190 | 16 | Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz | 40 | Yes | 192 GB | 960 GB/node | 2 x Tesla V100 | Omni-Path | 25 GbE
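A GPU job can confirm which devices it has been given, for example the two Tesla V100s on an XL190 node. The following C sketch uses the CUDA runtime API; it assumes the CUDA toolkit is available and that the source is compiled with nvcc, and the file name is illustrative.

    /* gpu_query.cu: list the GPUs visible to the job via the CUDA runtime API.
     * Compile with e.g. `nvcc gpu_query.cu -o gpu_query`.
     */
    #include <stdio.h>
    #include <cuda_runtime.h>

    int main(void)
    {
        int count = 0;
        cudaError_t err = cudaGetDeviceCount(&count);
        if (err != cudaSuccess) {
            fprintf(stderr, "cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
            return 1;
        }

        for (int i = 0; i < count; ++i) {
            struct cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, i);
            /* Report each device's model name and total memory in GiB. */
            printf("GPU %d: %s, %.1f GiB\n", i, prop.name,
                   prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
        }
        return 0;
    }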




