Campus researchers have access to Strelka, Swarthmore's computer cluster. To learn more about the capabilities of the system and obtain an account, email firstname.lastname@example.org.
The cluster consists of 18 compute nodes, each with two CPUs, plus a head node that handles user logins and job scheduling.
12 mid-memory nodes (384GB - 512GB RAM)
3 high-memory nodes (768GB RAM)
1 high-CPU node (72 cores)
2 GPU nodes, each with 4x NVIDIA RTX 2080 Ti GPUs
Over 700TB storage
High-speed InfiniBand networking
Jobs are submitted through the Slurm job scheduling system.
18x Intel Xeon Gold 6230/6430
8x NVIDIA RTX 2080 Ti
Please see these instructions for creating an account on Strelka.
Please see these instructions for logging into Strelka.
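Once your account is set up, logging in is a standard SSH connection. A minimal sketch follows; the hostname shown is an assumption, so use the address given in your account-creation instructions.

```shell
# Connect over SSH (from a campus network or the College VPN).
# NOTE: the hostname below is an assumption -- substitute the
# address provided in your Strelka account instructions.
ssh username@strelka.swarthmore.edu
```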
Please see these instructions for transferring files to/from Strelka.
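For command-line transfers, scp and rsync are the usual tools. The commands below are a sketch; the hostname and paths are assumptions, so substitute the real address and your own files.

```shell
# Copy a local input file to your home directory on the cluster
# (hostname and filename are illustrative assumptions).
scp input_data.csv username@strelka.swarthmore.edu:~/

# rsync is often preferable for large or resumable transfers:
# -a preserves attributes, -v is verbose, -P shows progress
# and allows resuming interrupted transfers.
rsync -avP results/ username@strelka.swarthmore.edu:~/results/
```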
To run code on Strelka, you need to submit a job to the queue. For complete information, see the Slurm Commands page.
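A typical submission is a small batch script handed to sbatch. The sketch below is illustrative only: the resource values and program name are assumptions, and the partitions and limits actually available are described on the Slurm Commands page.

```shell
#!/bin/bash
# Minimal Slurm batch script (illustrative values -- adjust the
# resource requests to match your job and Strelka's limits).
#SBATCH --job-name=example
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4
#SBATCH --mem=8G
#SBATCH --time=01:00:00
#SBATCH --output=example_%j.out

./my_program   # replace with your own executable or script
```

Save this as, say, example.sh, submit it with `sbatch example.sh`, and monitor it with `squeue -u $USER`.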
If you publish a paper where the cluster was used for calculation, please include the following acknowledgement:
“This work used the Strelka Computing Cluster, which is supported by the Swarthmore College Office of the Provost.”
An Advisory Group governs decisions concerning Strelka. Its members are:
Tristan Smith, Physics
Dan Grin, Haverford
Jason Simms, ITS
Andrew Ruether, ITS
Ways you can contact ITS or find information:
ITS Support Portal: https://support.swarthmore.edu
Phone: x4357 (HELP) or 610-328-8513
Check out our remote resources at https://swatkb.atlassian.net/wiki/spaces/remote/overview
Check our homepage at https://swarthmore.edu/its