It has taken a year and $50-million to put together, and its brain takes up as much room as a warehouse full of refrigerators.
Today, the monster finally opens its eyes, as the University of Toronto's newest supercomputer - the fastest such machine in Canada - goes online.
There's no shortage of beastly metrics by which this computer's power can be measured: It can perform more than 300 trillion calculations a second, simulate the Earth's climate 100 years into the future in four days and help researchers study cosmic background radiation, a calculation-intensive task that offers a glimpse into what the universe looked like 13 billion years ago.
"This positions us on a world research stage at a whole new level," said Chris Pratt, strategic initiatives executive at IBM Canada. "This isn't one step or two steps; this is like, 'Wow.'"
A small part of the IBM System x iDataPlex server has been operating since late last year, humming away in a Vaughan-area warehouse. But today, the machine's full power is unleashed.
Almost everything about the system sounds improbable.
It uses the same amount of energy, at peak consumption, as 4,000 homes. It is about 30 times more powerful than the next-fastest research computer in Canada. It can whirl data through its digital veins at a rate equal to about two DVD movies a second. It is among the 15 fastest computers in the world, and the fastest outside the United States.
Or think of it this way: If you've purchased a decent home computer lately, it may have come with a relatively fast 2.53 gigahertz processor. Or maybe you shelled out more for a fast, top-of-the-line "quad core" system, which runs on four such processors.
U of T's new toy runs on 30,240 of them.
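Those headline numbers roughly check out on the back of an envelope. The sketch below assumes each 2.53-gigahertz core completes about four floating-point operations per clock cycle, a typical figure for server chips of the era, not a published specification of this machine:

```python
# Back-of-envelope check of the "300 trillion calculations a second" figure.
# Assumption: ~4 floating-point operations per core per cycle (typical for
# the period's server processors; not an official spec of this system).
cores = 30_240          # processor cores, per the article
clock_hz = 2.53e9       # 2.53 GHz
flops_per_cycle = 4     # assumed

peak_flops = cores * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e12:.0f} teraflops")  # ~306 teraflops
```

Multiplying out gives roughly 306 teraflops, consistent with "more than 300 trillion calculations a second."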
For a system that'll suck up at least $1-million worth of energy a year, it seems odd to describe the computer as energy efficient, but, in the superlative-laden world of supercomputers, it is.
The computer monitors its individual units continuously, and shuts off any that have sat idle for more than about 10 minutes.
IBM's designers also came up with what seems like a pretty obvious way to reduce cooling costs: let the Canadian winter do it. Whenever the outdoor temperature drops below a certain point, the computer uses cold outside air to chill out. In total, IBM estimates that, built with traditional cooling methods, the same supercomputer would use as much additional energy as about 750 more homes.
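The two energy-saving policies described above can be sketched as simple rules. This is a toy illustration only: the node names, the temperature threshold, and the function names are invented, and the article does not publish the machine's actual cutoffs beyond the "about 10 minutes" idle window.

```python
import time

IDLE_SHUTOFF_SECONDS = 10 * 60    # "about 10 minutes", per the article
FREE_COOLING_THRESHOLD_C = 5.0    # illustrative threshold; the real cutoff isn't published

def idle_nodes_to_power_off(last_active_times, now=None):
    """Return the nodes that have sat idle longer than the shutoff window."""
    now = now if now is not None else time.time()
    return [node for node, last_active in last_active_times.items()
            if now - last_active > IDLE_SHUTOFF_SECONDS]

def cooling_mode(outdoor_temp_c):
    """Use free (outside-air) cooling whenever it's cold enough outdoors."""
    return "free" if outdoor_temp_c < FREE_COOLING_THRESHOLD_C else "chilled"

# Example: node "b" has been idle for 15 minutes, node "a" for 1 minute.
now = 1_000_000.0
print(idle_nodes_to_power_off({"a": now - 60, "b": now - 900}, now=now))  # ['b']
print(cooling_mode(-10.0))  # free
```

The point of both rules is the same: pay for power and chilled water only when the workload and the weather actually demand it.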
"It's a really impressive, world-class facility," said Chris Loken, the chief technical officer for the SciNet consortium, which is responsible for getting the supercomputer built. "It's going to give a lot of scientists access to some really powerful resources."
The supercomputer's uses, Mr. Loken notes, vary from planetary physics to aerospace research to medicine. For example, the machine will act as a data centre for the Large Hadron Collider - the world's largest atom smasher, located underground near Geneva - which can generate more than 40-million collisions a second. Researchers are already using some of the computer's capacity to study cosmic background radiation.
But even for a computer as powerful and expensive as this one, its reign among the world's heavyweights will be short-lived. Both Mr. Loken and Mr. Pratt admit that, given the pace at which technology is moving, U of T's supercomputer will be eclipsed by several faster machines in the next few years.
"It will not be in the top 20 systems next year," Mr. Loken said. "But there's still going to be a lot of awfully good research that can be done on it.
"There's really good science being done now on number 100 and number 500."
TOP 5 SUPERCOMPUTERS AROUND THE WORLD
1. Los Alamos National Laboratory, New Mexico - 1,105.00 teraflops
2. Oak Ridge National Laboratory, Tennessee - 1,059.00 teraflops
3. NASA Advanced Supercomputing Division, California - 487.01 teraflops
4. Lawrence Livermore National Laboratory, California - 478.20 teraflops
5. Argonne National Laboratory, Illinois - 450.30 teraflops
Source: Top 500 supercomputer sites for November 2008