Ireland's High-Performance Computing Centre | ICHEC

Stoney Supercomputer


Stoney (stoney.ichec.ie) is a Bull Novascale R422-E2 cluster with 64 compute nodes. Each compute node has two 2.8GHz Intel (Nehalem EP) Xeon X5560 quad-core processors and 48GB of RAM; 62 of these nodes, a total of 496 cores and 2976GB of RAM, are available for jobs.

The nodes are interconnected via a half-blocking fat-tree network of ConnectX Infiniband (DDR) providing high bandwidth and low latency for both computational communications and storage access. To maximise interconnect performance, the resource manager is configured to provide fully non-blocking node allocations to jobs. Storage is provided by an EMC CLARiiON CX4-120 SAN with 21TB (formatted) of capacity to the compute nodes via two Lustre filesystems. Each compute node also provides 870GB of local scratch capacity on a directly attached hard disk.

Stoney also provides ICHEC's National GPU Service with 24 of the compute nodes reserved for GPGPU computing. These nodes have two NVIDIA Tesla M2090 cards installed, with each card providing 512 GPU cores, 6GB of local GDDR5 memory and a theoretical peak double-precision performance of 665GFlops.

In addition to the compute nodes a set of service, administrative and storage nodes provide user login, batch scheduling, management, the Lustre filesystems, etc.

Stoney is cooled by high-efficiency cool cabinet doors provided by Bull. These doors are mounted directly to the rear of the racks and maximise energy efficiency by providing chilled water heat exchange close to the source.

Stoney was funded under the PRTLI Cycle 4-funded e-INIS project. This grant was awarded to NUI Galway, which in turn provides the system for national use, managed by ICHEC. As part of this agreement NUI Galway owns a percentage of the system time.


A large number of scientific software packages, libraries, compilers and other tools are installed and maintained on Stoney by ICHEC. A full listing of these will be made available on the software page.
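HPC clusters of this generation typically expose their installed compilers and libraries through an environment-modules system. Assuming Stoney follows that convention (this page does not name the tool, and the package names below are placeholders, not taken from this page), a typical session would look like:

```shell
module avail              # list the software packages and compilers installed
module load intel-mpi     # load a package into the environment (placeholder name)
module list               # show which modules are currently loaded
```

The actual module names for Stoney would be those shown by `module avail` or listed on the software page.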

Scheduling Policy

The following queues are available on Stoney:

  Queue    Max Job Size (cores)   Maximum Walltime   Maximum Available Cores
  ShortQ   128                    84 hours           316
  ProdQ    128                    48 hours           316
  LongQ    8                      240 hours          316
  GpuQ     64                     24 hours           192


  • Users should avoid specifying a queue in their PBS scripts: specifying only the walltime and processor requirements lets the scheduler route the job to the most appropriate queue.
  • Hard and soft throttling limits are in place to prevent a single user from flooding the entire system.
  • Queued jobs gain priority at different rates depending on many factors.
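As a sketch of the guidance above, a minimal PBS job script might look like the following. The job name, MPI launch line, and executable are placeholders, not taken from this page; the key point is that no `#PBS -q` line appears, so the walltime and core request alone determine the queue:

```shell
#!/bin/bash
#PBS -N example_job            # job name (placeholder)
#PBS -l walltime=12:00:00      # 12 hours requested
#PBS -l nodes=4:ppn=8          # 4 nodes x 8 cores per node = 32 cores
# Note: no "#PBS -q" directive -- per the scheduling policy, the
# resource manager routes the job to a queue based on these requests.

cd $PBS_O_WORKDIR              # start in the directory the job was submitted from
mpirun -np 32 ./my_application # placeholder MPI executable
```

The script would be submitted with `qsub`; whether this exact directive syntax matches Stoney's PBS configuration should be checked against the system documentation.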


Introductory documentation for Stoney will be added in our documentation section.

