
CRC NOTS Description

Introduction

NOTS (a name derived from the historical Night Owls Time-Sharing Service) is a batch-scheduled HPC/HTC cluster running on the Rice Big Research Data (BiRD) cloud infrastructure. The system consists of 298 dual-socket compute blades housed in HPE s6500, HPE Apollo 2000, and Dell PowerEdge C6400 chassis. All nodes are interconnected with a 10 or 25 Gigabit Ethernet network. In addition, the Apollo and C6400 chassis are connected with a high-speed Omni-Path fabric for message-passing applications. A 300 TB VAST filesystem is attached to the compute nodes via Ethernet. The system supports a variety of workloads, including single-node, parallel, large-memory multithreaded, and GPU jobs.

NOTE: An expansion to NOTS is available for testing. For details, please see CRC NOTS Expansion (NOTSx).
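Because NOTS is batch scheduled, work is submitted through the scheduler rather than run directly on a login node. As a minimal sketch only — this assumes a Slurm scheduler, which this page does not itself confirm, and `my_program` is a placeholder for your own application:

```shell
#!/bin/bash
#SBATCH --job-name=example      # name shown in the queue
#SBATCH --nodes=1               # single-node job
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4       # cores for a multithreaded run
#SBATCH --mem=8G                # memory per node
#SBATCH --time=01:00:00         # wall-clock limit (HH:MM:SS)

# Replace with your actual application
srun ./my_program
```

Assuming Slurm, such a script would be submitted with `sbatch job.sh` and monitored with `squeue -u $USER`; consult the cluster's own submission documentation for the partitions and limits that actually apply.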


NSF Citation

If you use NOTS to support your research activities, you are required to acknowledge (in publications, on your project web pages, …) the National Science Foundation grant that was used in part to fund the procurement of this system. An example acknowledgement that can be used follows. Feel free to modify wording for your specific needs but please keep the essential information:

This work was supported in part by the Big-Data Private-Cloud Research Cyberinfrastructure MRI-award funded by NSF under grant CNS-1338099 and by Rice University's Center for Research Computing (CRC).
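For manuscripts prepared in LaTeX, the acknowledgement can be dropped into an unnumbered section, for example:

```latex
\section*{Acknowledgements}
This work was supported in part by the Big-Data Private-Cloud Research
Cyberinfrastructure MRI-award funded by NSF under grant CNS-1338099 and by
Rice University's Center for Research Computing (CRC).
```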

Node Configurations

Compute

Compute nodes on NOTS

Hardware | Nodes | CPU | Cores | Hyperthreaded | RAM | Disk | High Speed Network | Storage Network
HPE SL230s | 136 | Intel(R) Xeon(R) CPU E5-2650 v2 @ 2.60GHz | 16 | Yes | varies: 32 GB to 128 GB | 4 TB/node | None | 10 GbE
HPE XL170r | 28 | Intel(R) Xeon(R) CPU E5-2650 v4 @ 2.20GHz | 24 | Yes | varies: 32 GB to 128 GB | 200 GB/node | Omni-Path | 10 GbE
Dell PowerEdge C6420 | 60 | Intel(R) Xeon(R) Gold 6126 CPU @ 2.60GHz | 24 | Yes | 192 GB | 120 GB/node | Omni-Path | 10 GbE
HPE XL170r | 52 | Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz | 40 | Yes | 192 GB | 960 GB/node | Omni-Path | 25 GbE
HPE XL170r | 8 | Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz | 40 | Yes | 768 GB | 960 GB/node | Omni-Path | 25 GbE
HPE XL170r | 4 | Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz | 40 | Yes | 1.5 TB | 960 GB/node | Omni-Path | 25 GbE
Dell PowerEdge C6520 | 3 | Intel(R) Xeon(R) Gold 6336Y CPU @ 2.40GHz | 48 | Yes | 256 GB | 960 GB/node | Omni-Path | 25 GbE

GPU

GPU nodes on NOTS

Hardware | Nodes | CPU | Cores | Hyperthreaded | RAM | Disk | GPU | High Speed Network | Storage Network
HPE SL270s | 2 | Intel(R) Xeon(R) CPU E5-2650 v2 @ 2.60GHz | 16 | Yes | 128 GB | 4 TB/node | 4 x Tesla K80 | None | 10 GbE
HPE XL675d | 3 | AMD EPYC 7343 CPU @ 3.2GHz | 32 | Yes | 512 GB | 960 GB/node | 8 x Tesla A40 | Omni-Path | 25 GbE
HPE XL190 | 16 | Intel(R) Xeon(R) Gold 6230 CPU @ 2.10GHz | 40 | Yes | 192 GB | 960 GB/node | 2 x Tesla V100 | Omni-Path | 25 GbE
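To land on one of the GPU nodes above, a job must explicitly request GPU resources from the scheduler. A minimal sketch, again assuming Slurm with generic-resource (GRES) scheduling — the exact GRES names and any GPU partition are site-specific and not confirmed by this page:

```shell
#!/bin/bash
#SBATCH --job-name=gpu-example
#SBATCH --nodes=1
#SBATCH --gres=gpu:1            # request one GPU on the allocated node
#SBATCH --time=02:00:00

# Under Slurm GRES scheduling, CUDA_VISIBLE_DEVICES is set for the
# allocated GPU; replace with your actual GPU application.
srun ./my_gpu_program
```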



Keywords: CRC NOTS Description, NSF Citation
Doc ID: 108444
Owner: Bryan R.
Group: Rice U
Created: 2021-01-19 15:17:42
Updated: 2024-10-29 14:38:21
Sites: Rice University