December 18, 2020

Mission of HPCC


  • Provide campus-wide high-performance computing (HPC) infrastructure
  • Partner with other IT/HPC units at UCR and in the UC system
  • Provide training in HPC, big data processing and cloud computing

Online Portal

Hardware Infrastructure


Software

  • Large stack of thousands of free and open-source software tools and packages, managed via a module system
  • The facility frequently installs new software upon user request
  • Users can install software themselves, either manually or via package managers and environment management systems such as Conda, CRAN/Bioc, etc.
  • Containerization using Singularity
  • Web-based environments: Jupyter Notebooks and RStudio Server
  • Commercial tools only where necessary:
    • GPFS parallel file system for storage
    • Gaussian
    • MATLAB (to use it, ask to be added to the license)
    • Intel Parallel Suite
    • SAS and Stata
  • Additional commercial software can be installed if funding for license is available
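The software stack above is reached in a few different ways from a login shell. A minimal session sketch (the module, package, and image names below are placeholders, not tools confirmed to be installed; run `module avail` to see what the cluster actually provides):

```shell
# Listing and loading module-managed software (module name is an example)
module avail                      # list all available software modules
module load samtools              # load a tool into the current shell environment

# Self-managed installs via Conda (environment and package names are examples)
conda create -n myenv -c bioconda -c conda-forge salmon
conda activate myenv

# Running a containerized tool with Singularity (image name is an example)
singularity exec mytool.sif mytool --help
```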

Usage Stats

  • ~140 registered labs/groups:
    • ~28 departments
    • all colleges and schools: BCOE, CNAS, CHASS, SOM, SOB, SOPP
  • over 600 users

Contacts and Location

  • Contacts and Communication
  • Location
    • Offices (currently remote): 1208/1207 Genomics Building
    • Server room
      • Genomics Building, Rm 1120A
    • Server room for data backup systems
      • CoLo Server Room in SOMED Building

Organization and Management of HPCC

  • Administered under the UCR-wide Office of Research and Economic Development (RED)
  • Staff
    • Faculty director (Thomas Girke)
    • 2 HPC systems administrators (Jordan Hayes & new member)
    • Student HPC/Linux admin assistants (Melody Asghari, Abraham Park, et al.)
  • Faculty advisory board
  • Modular funding model
    • Recharging and ownership options
    • Grant funding
      • Equipment grants (e.g. NSF-MRI and NIH-S10)
      • Research grants (many agencies and programs)
    • UCR contribution

Recharging and Quotas

  • Subscription fee
    • Subscription-based access model: $1000 per lab/yr; includes any number of user accounts per lab, each with 20GB of storage
    • Big data storage:
      • GB plan: $25/yr for 100GB (usable and backed-up space)
      • TB plan: $1000/yr for 10TB (usable and backed-up space)
    • Ownership models: see next slide
  • Quota
    • Maximum CPU core usage limited to 512 CPU cores per lab and 256 CPU cores per user per partition
    • No charge for CPU hours! RAM quotas are queue-specific.
  • Details about recharging rates: see here
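The core limits above map directly onto per-job resource requests. A minimal Slurm-style batch-script sketch, assuming a hypothetical partition name and illustrative CPU/RAM values (check `sinfo` and the queue-specific RAM limits for real settings):

```shell
# Write a minimal job script; all resource values here are illustrative only.
cat > quota_demo.sh <<'EOF'
#!/bin/bash
#SBATCH --partition=batch        # hypothetical partition name
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=32       # well under the 256-core per-user, per-partition cap
#SBATCH --mem=64G                # RAM quotas are queue-specific; adjust per partition
#SBATCH --time=2:00:00
hostname
EOF
# On the cluster, submit with:
#   sbatch quota_demo.sh
```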

Ownership Options

Ownership models for computer nodes and data storage. Attractive to labs that need 24/7 access to hundreds of CPU cores and more than 30TB of storage.


Data storage

  • Purchase hard drives @ current market price that will be added to GPFS storage
  • Annual maintenance fee: ¼ of the rental price
  • Often more cost effective for storage needs ≥30TB

Computer nodes

  • Purchase nodes at current market price with compatible network cards
  • Administered under a priority queueing system via a private queue for owners; ownership also increases the lab's overall CPU quota by the number of owned CPU cores.
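Jobs reach owned nodes through the lab's private queue. A sketch, assuming a hypothetical partition named `mylab` (actual partition names are assigned per lab by HPCC):

```shell
# Job script targeting an owner partition; the partition name is an assumption.
cat > owner_demo.sh <<'EOF'
#!/bin/bash
#SBATCH --partition=mylab        # private queue for node owners (hypothetical name)
#SBATCH --ntasks=64              # owned cores also raise the lab-wide CPU quota
hostname
EOF
# On the cluster, submit with:
#   sbatch owner_demo.sh          # jobs here get priority on the owned nodes
```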