Note: HPCC’s latest Recharging Rates document is here.
User account requests
- Email user account requests to firstname.lastname@example.org. Please include the email address and full name of both the user and the PI.
- If a PI’s lab is not yet registered, an FAU (Full Accounting Unit) for the annual registration fee (see below) is required.
HPCC’s recharging rate structure is outlined below. A more formal summary is available here.
Lab-based Registration Fee
An annual registration fee of $1,000 gives all members of a UCR lab access to our high-performance computing infrastructure. The registration provides access to the following resources:
- Over 6,800 CPU cores (60% Intel, 40% AMD), ~60,000 CUDA cores (NVIDIA K80 and P100 GPUs), ~2 PB of parallel GPFS-based disk space, and 512 GB to 1 TB of memory per node. More details are available on the hardware page.
- Over 1,000 software packages and community databases. Details are available on the software page.
- Free attendance at workshops offered by HPCC staff
- Free consultation services (up to 1 hour per month)
- Note: there is no extra charge for CPU usage, but each user and each lab has a CPU quota of 256 and 512 CPU cores, respectively. Jobs exceeding these quotas can still be submitted, but they remain queued until resources within the quota limits become available.
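As an illustration of how the quota behaves in practice, here is a minimal batch-script sketch, assuming the cluster runs Slurm; the partition name and resource values are hypothetical, not taken from HPCC documentation:

```sh
#!/bin/bash
#SBATCH --partition=batch      # hypothetical partition name
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=32     # these 32 cores count toward the 256-core user quota
#SBATCH --mem=64G
#SBATCH --time=1-00:00:00
#SBATCH --job-name=example

# A job requesting cores beyond the remaining quota is not rejected;
# it stays in the pending (PD) state until enough quota frees up.
hostname
```

Submitting with `sbatch` and checking `squeue -u $USER` would show the job either running or pending against the quota.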
Big data storage
- Standard user accounts have a storage quota of 20 GB. To gain access to much larger storage pools, PIs have the option to rent or own storage space.
Storage rental option
- $1,000 per 10 TB of usable, backed-up storage space. The storage pool is shared among all user accounts of a registered lab.
Storage ownership option
- The lab purchases storage hardware (e.g. hard drives) according to the facility’s specifications. Owned hard drives are added to the facility’s parallel GPFS storage system. The annual support fee for owned disk storage is $250 per 10 TB of usable, backed-up storage space. Owned storage is available only to a PI’s users and to anyone else the PI grants access.
- Owned storage can be attractive for labs with storage needs above 40 TB.
Computer node ownership option
- The lab purchases compatible computer nodes (e.g. with supported network cards). An example of a popular high-density architecture is this one: 4 nodes, each with two 16-core Intel CPUs (128 physical cores total), 512 GB of RAM, a 1.2 TB SSD, and an FDR InfiniBand interconnect. Similar options exist for GPU nodes.
- Nodes are administered under a priority queueing system that gives users from an owner lab priority and also increases that lab’s overall CPU quota (see above) by the number of owned CPU cores.
- Owned computer nodes are an attractive solution for labs requiring 24/7 access to hundreds of CPU cores with little or no queue wait time.
Software install requests
- Registered users can email software install requests to HPCC’s issue tracking system at email@example.com. Requests are addressed in the order received; simple installs are typically completed within one to a few days, while complex installs may take longer.
Department cluster membership with owned computing nodes
This option addresses the need for department-level HPC access where the standard PI-based membership is not practical, e.g. providing cluster access to a large number of undergraduate students in classes. Under this model, a department purchases computer nodes that are administered similarly to the Ownership model described above. Because departments are expected to have a large number of users, the CPU quota per user is lower than under the PI-based model.
- The latest hardware/facility description (e.g. for grant applications) is available here.