Storage Overview

This page describes the storage systems available at the RCC and how to use them.

At A Glance

  • Parallel (HPC) Storage (IBM Spectrum Scale "GPFS") - The primary storage system available on the HPC cluster
    • This is a 3.1 petabyte storage system.
    • Users are entitled to 150GB of personal storage space in their home directory (/gpfs/home/[username]).
    • Research groups have the option of purchasing dedicated volumes in the /gpfs/research volume.
    • Transfer data to and from GPFS using Globus, SFTP, or rsync.
    • More information and pricing »
  • Research Archival Storage (Ceph) - A low-cost, long-term storage platform for "at rest" research data
    • This is a 2.4 petabyte storage system.
    • Archival storage is available for purchase at a lower rate than the HPC (GPFS) storage service.
    • Transfer data to and from the Archival Storage System using Globus (from anywhere on the Internet) or SFTP (on-campus only).
    • Ideal for custom applications that need unstructured data storage. Examples:
      • Unstructured data such as images, videos, or raw data files.
      • Backup files, database dumps, and log files.
    • More information and pricing »
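
As a sketch of the on-campus SFTP/rsync path to Archival Storage (the hostname and group directory below are placeholders, not real RCC addresses; Globus is the option for off-campus transfers):

```shell
# Sketch: restoring a file from Archival Storage over the on-campus
# rsync/SFTP path. "export.rcc.example.edu" and "mygroup" are
# placeholders; substitute your site's actual export server and group.
RCC_USER="jsmith"
EXPORT_HOST="export.rcc.example.edu"   # placeholder hostname
SRC="${RCC_USER}@${EXPORT_HOST}:/mnt/archival/mygroup/backup.tar.gz"

# The actual transfer (shown as a comment; works on-campus only):
#   rsync -avP "$SRC" ./restore/
echo "$SRC"
```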

Accessing Filesystems

You can access both systems via our export server.

  • Parallel HPC storage is mounted by default on our export server, login cluster nodes, and all compute nodes in the following paths:
    • /gpfs/home - Individual users' home directories are located here. By default, the system places you in your home directory upon login.
    • /gpfs/research - Research groups that have purchased dedicated space on our parallel file system have their directories here.
  • Archival is mounted on our export server at the following path: /mnt/archival.
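
A quick orientation sketch of those mount points (the username below is a placeholder; on the cluster you would use your own, or `$USER`):

```shell
# Placeholder username for illustration.
RCC_USER="jsmith"

GPFS_HOME="/gpfs/home/${RCC_USER}"   # 150GB personal home directory
ARCHIVAL="/mnt/archival"             # mounted on the export server only

# Typical checks on the cluster (shown as comments, since these paths
# exist only on RCC systems):
#   du -sh "$GPFS_HOME"              # how much of the 150GB quota is used
#   ls /gpfs/research/               # purchased group volumes
echo "home: $GPFS_HOME  archival: $ARCHIVAL"
```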


Our export server provides a 40Gb/s uplink to our parallel storage (GPFS) and a 20Gb/s uplink to our Archival storage. This is much faster than accessing storage via the HPC login nodes. Connect to the export server when transferring large amounts of data via SFTP or SCP/rsync.
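
A minimal sketch of a bulk transfer through the export server (the hostname "export.rcc.example.edu" is a placeholder; use your site's actual export server address):

```shell
# Sketch: pushing a local directory into a GPFS home over the
# export server's 40Gb/s GPFS uplink. Hostname and username are
# placeholders for illustration.
RCC_USER="jsmith"
EXPORT_HOST="export.rcc.example.edu"
DEST="${RCC_USER}@${EXPORT_HOST}:/gpfs/home/${RCC_USER}/data/"

# Bulk copy with progress and resume support (shown as comments):
#   rsync -avP ./data/ "$DEST"
# Or interactively:
#   sftp "${RCC_USER}@${EXPORT_HOST}"
echo "$DEST"
```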

Backup Disclaimer

We do not provide backups for data on our storage systems. Both GPFS and Archival Storage are robust enterprise-grade storage systems with multiple redundancies and very high levels of reliability. However, as with any system, errors can occur.

We do not provide any guarantees or warranties for any data stored on our systems. We advise researchers to bear this in mind when planning for data storage needs.