The US Department of Energy is planning to install 100+ petaflop supercomputers within the next four years as the famous national laboratories at Oak Ridge, Argonne and Lawrence Livermore upgrade to more powerful machines.
The labs, which are working together under the CORAL (Collaboration of Oak Ridge, Argonne and Livermore) umbrella, will issue requests for proposals later this year, having posted a request for information in December 2012.
By grouping together - as the National Energy Research Scientific Computing Center (NERSC), Los Alamos and Sandia National Laboratories did in an earlier round of supercomputer purchases - the research labs are hoping to drive down costs, while saving on the time and resources invested into procurement.
The specifications for the CORAL supercomputers will initially require performance of between 100 and 300 petaflops, as well as between five and 10 petabytes of memory, and up to 150 petabytes of storage.
The contest is likely to be fought out between IBM and Cray, though both would need to deploy designs and microprocessor technology still under development. At the same time, a stated aim of the procurement is to stimulate further research and development into such technology. Petaflop computing is expected to become increasingly common towards the end of the decade, and the CORAL machines ought to feature technology that could also be pushed towards exascale computing, according to US supercomputing news outlet HPCwire.
When the procurement is complete and the new machines installed, they will be roughly 10 times more powerful than the labs' fastest existing machine, Titan, which is capable of 24 peak petaflops and is currently the most powerful computer in the world.
The computers will be used to support research into molecular dynamics, cosmology, computational fluid dynamics and combustion, among other areas in line with the missions of the Office of Science and the US National Nuclear Security Administration.