Although today’s laptops can handle many advanced and computationally intensive tasks, projects involving Big Data require significantly more resources. That need is met by HPC (High-Performance Computing) infrastructure, built from networked computing clusters combined with large pools of memory and storage. Access to these resources is remote, so job submission and data preview occur through an interface on any local machine from any (allowed) geolocation. The HPC infrastructure is a shared community space, so you should familiarize yourself with the usage policy to avoid disrupting your peers’ work.
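In practice, that remote interface is usually a terminal session opened over SSH (covered in section 2). As a minimal sketch, assuming a placeholder username and cluster hostname (your cluster’s documentation provides the real ones):

    # Connect to the cluster's login node from your local machine
    # (replace "alex" and "hpc.university.edu" with your own credentials and host)
    ssh alex@hpc.university.edu

    # Once connected, you land on a login node; heavy computations are then
    # submitted to compute nodes through the job scheduler (see section 5)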

Table of contents

1. Introduction to HPC infrastructure

2. Secure Shell Connection (SSH)

3. Remote Data Access (see more in section 7: Data Acquisition)

4. Available Software

5. Introduction to Job Scheduling

6. Introduction to GNU Parallel

7. Introduction to Containers

