High Performance Computing (HPC)

Temple offers three high-performance computing Linux environments that give university researchers the computing power needed for intensive data analysis.

Overview

The HPC environment consists of:

  • Owl's Nest (owlsnest.hpc.temple.edu)
  • Compute (compute.temple.edu)
  • Machine Learning (gpu.hpc.temple.edu & dgx-1.hpc.temple.edu)

Use of this environment is restricted to those users with significant computational and/or large data manipulation needs.

The HPC environment is designed to meet the needs of the following:

  • Faculty, staff, and graduate and professional students with a current AccessNet account who require high-end computing functionality
  • Undergraduate students with faculty sponsorship who require high-end computing functionality
  • Authorized guests from other research institutions working in collaboration with university employees

To learn more, review the computing options in the comparison chart below.

HPC Comparison Chart

 
The chart compares the three Temple systems (Compute, Machine Learning, and Owl's Nest) and the off-site XSEDE environment.

Best for

Compute

  • Large high-performance, shared-memory compute server for interactive use
  • The primary HPC resource for interactive calculations, for applications that do not parallelize well, and for long-running calculations that cannot be interrupted and restarted
  • Typical applications include SAS, Stata, R, MATLAB, Mathematica, and RStudio
  • The recommended resource for learning how to use Linux servers for computation: running, developing, testing, and benchmarking code

Machine Learning

  • GPU computation only
  • Two GPU compute servers for intensive GPU computing, each with either 4 or 8 NVIDIA Tesla V100 GPUs interconnected with NVLink2
  • Software stacks specifically optimized for neural networks and deep learning
    - GPU Compute: shared access
    - DGX-1: exclusive access for up to 7 days for production runs
  • The recommended resource for machine learning and intensive GPU computing applications; a quick GPU check is sketched below
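
As a first sanity check on one of these GPU nodes, it helps to confirm that a deep-learning framework actually sees the GPUs. The sketch below assumes a Python environment with PyTorch installed; the frameworks, modules, and environments actually provided are not listed here, so see www.hpc.temple.edu for details.

```python
# Minimal sketch: confirm that the Tesla V100 GPUs are visible to a deep-learning
# framework. Assumes PyTorch is available in the active Python environment;
# loading the right modules or conda environment is site-specific.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        print(f"GPU {i}: {torch.cuda.get_device_name(i)}")
else:
    print("No CUDA-capable GPU is visible to PyTorch")
```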

Owl's Nest

  • High-performance computing cluster for distributed parallel computing with message-passing parallelization (MPI), intra-node multi-threading (OpenMP), and high-throughput computations; a minimal MPI sketch follows this entry
  • Primarily CPU-only, with some GPUs
  • Multiple hardware queues with different capabilities (CPU cores, RAM, local storage)
  • The recommended resource for highly parallel computations and large-scale high-throughput computations; for applications that do not scale well to at least 16 processors, or that do not involve at least 16 concurrent serial calculations, the Compute server is recommended instead
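
For readers new to message passing, the sketch below shows the basic MPI pattern. It uses Python with the mpi4py package purely as an illustration (an assumption; compiled C, C++, or Fortran MPI codes follow the same structure), and the launch command, module setup, and batch queues are specific to Owl's Nest; see www.hpc.temple.edu for details.

```python
# Minimal MPI sketch (assumes Python with the mpi4py package): each rank
# computes a partial sum and rank 0 collects the total with a reduction.
# Typically launched with something like "mpirun -n 16 python partial_sums.py";
# the actual launcher and queue setup are cluster-specific.
from mpi4py import MPI

comm = MPI.COMM_WORLD            # communicator spanning all launched processes
rank = comm.Get_rank()           # this process's ID, 0 .. size-1
size = comm.Get_size()           # total number of MPI processes

# Each rank works on its own slice of the problem.
partial = sum(range(rank * 1_000_000, (rank + 1) * 1_000_000))

# Combine the partial results on rank 0.
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} ranks computed total = {total}")
```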

XSEDE

  • Off-site HPC: the Extreme Science and Engineering Discovery Environment
  • An NSF-funded virtual organization that integrates and coordinates the sharing of advanced digital services, including supercomputers and high-end visualization and data analysis resources
  • Digital services providing seamless integration with NSF's high-performance computing and data resources
  • XSEDE's integrated, comprehensive suite of advanced digital services, combined with other high-end facilities and campus-based resources, serves as the foundation for a national cyberinfrastructure ecosystem
  • XSEDE also provides the expertise to ensure that researchers can make the most of these supercomputers and tools

Available Storage Space per User

  • Compute: up to 500 GB, more on request
  • Machine Learning: up to 1 TB, more on request
  • Owl's Nest: 20 GB (home) / 1 TB (work) / 500 TB (scratch, purged monthly) / project folders on request

For details and to apply for an account, go to www.hpc.temple.edu.

Data Sensitivity

  • Compute: Unrestricted
  • Machine Learning: Unrestricted
  • Owl's Nest: Unrestricted & Sensitive

Collaborate with other users on the resource, inside or outside of Temple

  • Compute: inside Temple; group access inside Temple on request
  • Machine Learning: inside Temple; group access inside Temple on request
  • Owl's Nest: inside Temple; group access inside Temple on request

Backup

  • Compute: No
  • Machine Learning: No
  • Owl's Nest: No

Access from

  • Compute: Worldwide via SSH
  • Machine Learning: Worldwide via SSH
  • Owl's Nest: Worldwide via SSH

Data Transfer

  • Compute: scp, sftp, rsync
  • Machine Learning: scp, sftp, rsync
  • Owl's Nest: scp, sftp, rsync, Globus (GridFTP)
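
The tools listed above are normally run from the command line, but transfers can also be scripted. The sketch below drives rsync from Python; the hostname is taken from the overview above, while the username ("accessnet_id") and directory names are placeholders.

```python
# Minimal sketch: mirror a local directory to an HPC system over SSH using
# rsync, driven from Python. Hostname from the overview above; the username
# and paths are placeholders to replace with your own.
import subprocess

def push_results(local_dir: str, user: str, host: str, remote_dir: str) -> None:
    """Copy local_dir to remote_dir on the HPC system with rsync -avz."""
    subprocess.run(
        ["rsync", "-avz", "--progress", local_dir, f"{user}@{host}:{remote_dir}"],
        check=True,  # raise CalledProcessError if rsync exits non-zero
    )

if __name__ == "__main__":
    push_results("results/", "accessnet_id", "owlsnest.hpc.temple.edu", "work/")
```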