It's interesting to read articles like this since I came from bizarro not-so-HPC world where "we" (academic department) didn't pay for electricity (the university did!), so there was no incentive at all to retire obsolete hardware. Right up until I left many months ago, I was running jobs on a cluster made up of 84 servers on death's door, each with dual-processor (not dual-core!) Nocona Xeons or Opteron 240s.
"Oh, but there's a cost to support obsolete hardware!" Yeah, sure, but the person supporting everything was me, and I was a constant cost to keep around whether I supported crappy obsolete hardware or shiny new hardware.
"At more than one quadrillion floating point operations per second..." - how fast is that when cracking typical passwords or mining bitcoins? If we have an encrypted drive, how long would it take to find the passphrase with it?
Well, that's four years. Only a year ahead of normal amortization schedules.
How many years till everyone's cell phone has this much computing power?
RIP Cell processor.
flops = floating-point operations per second, so why do all these tech articles keep using terms like "petaflop"? The "s" isn't a plural; dropping it doesn't make sense.
What better way to spend the public dime than this.
Oh, I wish I had a supercomputer I could play with right now.
Something about these numbers doesn't quite make sense. The reason cited for dismantling the machine is that "it isn't energy-efficient enough to make the power bill worth it." But the supercomputer uses 2345 kilowatts, which at US prices of around 15 cents per kWh would cost $352/hour to run in energy costs. By comparison, the $120 million cost of building Roadrunner, amortized over the four years it's been running, comes out to $3400/hour. The article makes it sound like the power bill is costing them a fortune, but at $3 million a year it isn't much at all next to the $120 million price tag.
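The arithmetic above holds up; here's a quick sketch of the same back-of-the-envelope calculation, assuming the quoted 2345 kW draw, 15 ¢/kWh, a $120 million build cost, and four years of continuous operation:

```python
# Figures quoted in the comment above (assumptions, not official LANL numbers)
POWER_KW = 2345          # claimed power draw
PRICE_PER_KWH = 0.15     # rough US electricity price, $/kWh
BUILD_COST = 120e6       # reported build cost, $
YEARS = 4                # years of operation used for amortization
HOURS_PER_YEAR = 24 * 365

energy_cost_per_hour = POWER_KW * PRICE_PER_KWH                # ~$352/hr
energy_cost_per_year = energy_cost_per_hour * HOURS_PER_YEAR   # ~$3.1M/yr
build_cost_per_hour = BUILD_COST / (YEARS * HOURS_PER_YEAR)    # ~$3425/hr

print(f"Energy: ${energy_cost_per_hour:,.0f}/hr "
      f"(${energy_cost_per_year / 1e6:.1f}M/yr)")
print(f"Amortized build cost: ${build_cost_per_hour:,.0f}/hr")
```

So the amortized hardware cost dwarfs the electricity bill by roughly a factor of ten, which is the commenter's point.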