Strelka Computer Cluster

Campus researchers have access to Strelka, Swarthmore's computer cluster.  To learn more about the capabilities of the system and obtain an account, email  

Technical Specifications

System Configuration

The cluster consists of 18 compute nodes, each with two CPUs, plus an additional head node that handles user logins and job scheduling.

  • 12 mid-memory nodes (384GB - 512GB RAM)

  • 3 high-memory nodes (768GB RAM)

  • 1 high-CPU node (72 cores)

  • 2 GPU nodes, each with 4x NVIDIA RTX 2080 Ti GPUs

  • Over 700TB storage

  • High-speed InfiniBand networking

Jobs are submitted through the Slurm job scheduling system.







  • Processors: 18x Intel Xeon Gold 6230/6430

  • GPUs: 8x NVIDIA RTX 2080 Ti


Creating an Account

Please see these instructions for creating an account on Strelka.

Logging in

Please see these instructions for logging into Strelka.  
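As a quick sketch, logging in is done over SSH from a terminal. The hostname below is an assumption; use the address given in the login instructions.

```shell
# Replace "yourusername" with your Swarthmore username.
# The hostname is an assumption; confirm it in the login instructions.
ssh yourusername@strelka.swarthmore.edu
```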

Transferring Files

Please see these instructions for transferring files to/from Strelka.
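For a rough sketch, files can be moved with standard tools such as scp or rsync. The hostname, filenames, and paths below are assumptions; substitute your own.

```shell
# Hostname, filenames, and paths are assumptions; substitute your own.

# Copy a local file to your home directory on Strelka:
scp mydata.csv yourusername@strelka.swarthmore.edu:~/

# Copy a results file from Strelka to the current local directory:
scp yourusername@strelka.swarthmore.edu:~/results.txt .

# For large directories, rsync can resume interrupted transfers:
rsync -avP my_project/ yourusername@strelka.swarthmore.edu:~/my_project/
```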

Submitting Jobs

To run code on Strelka, you need to submit a job to the queue. For complete information, see the Slurm Commands page.
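As a minimal sketch, a Slurm job is described in a batch script and handed to the scheduler with sbatch. The partition name, resource amounts, and program below are assumptions; check the Slurm Commands page for the values appropriate to Strelka.

```shell
#!/bin/bash
#SBATCH --job-name=myjob         # name shown in the queue
#SBATCH --ntasks=1               # one task
#SBATCH --cpus-per-task=4        # CPU cores for that task
#SBATCH --mem=8G                 # memory for the job
#SBATCH --time=01:00:00          # walltime limit (HH:MM:SS)
#SBATCH --output=%x-%j.out       # log file named jobname-jobid.out

# Replace with the program you actually want to run.
python my_script.py
```

Save the script (e.g. as job.sh), submit it with `sbatch job.sh`, and monitor it with `squeue -u $USER`.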

Other Strelka Resources


If you publish a paper in which the cluster was used for calculations, please include the following acknowledgement:

“This work used the Strelka Computing Cluster, which is supported by the Swarthmore College Office of the Provost.”


An Advisory Group has been established to govern decisions concerning Strelka. Its members are:

  • Tristan Smith, Physics

  • Dan Grin, Haverford

  • Jason Simms, ITS

  • Andrew Ruether, ITS

Ways you can contact ITS or find information:

  • ITS Support Portal

  • Phone: x4357 (HELP) or 610-328-8513

  • ITS remote resources

  • ITS homepage