Revisiting the ‘P’ in HPC

21 June 2012

“Don’t ask me about the ‘P’ in HPC…” is something that I’ve said a lot over the last half decade.

Era of Pervasive Computing

The article uses the phrase “the Era of Connected Devices”, but I like the phrase “the Internet of Things.” I won’t bother listing the “things,” but the posting was specifically focused on “sensors and devices that will monitor and sense our environments, collect data and provide timely and critical feedback.”

I like this quote: “Everything that will benefit from a connection will be connected.”

The following is something I wrote on 1 April 02009…

About the ‘P’ in HPC

HPC stands for High Performance Computing. Historically, the ‘P’ in HPC stood for “performance”; however, these days it stands for much more.

The following is my attempt to define HPC in one sentence.

HPC is a highly pervasive computing environment that
enables highly productive computing via a highly persistent
cyber-infrastructure that exploits highly parallel computing
to provide highly powerful computation.

With less than two years left in decade zero of the 21st century, HPC is really HP^6C — High Performance, Productivity, Pervasive, Persistent, Parallel, Powerful Computing.

HPC systems not only provide peta-scale computation, but they also provide powerful storage systems and, in many cases, powerful visualization systems (e.g. ASU’s Decision Theater).

HPC systems (which for the most part are hardware) need software, and these days the software is way behind the hardware. In other words, today’s software is not even close to exploiting the power of HPC systems.
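Amdahl’s law makes the gap concrete: a program’s speedup is capped by its serial fraction, so massive core counts buy little unless the code is almost entirely parallel. Here is a minimal Python sketch (illustrative numbers only; the 95%-parallel program is a hypothetical):

  # Amdahl's law: speedup on n cores when a fraction p of the
  # work is parallelizable and the rest (1 - p) runs serially.
  def amdahl_speedup(p, n):
      return 1.0 / ((1.0 - p) + p / n)

  # A 95%-parallel program saturates near 20x; almost all of a
  # petascale machine's cores would sit idle.
  for cores in (8, 1_000, 100_000):
      print(f"{cores:>7} cores -> {amdahl_speedup(0.95, cores):5.2f}x")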


20 Petaflops Coming Soon

31 March 2012

[1 April 02012]
Three years ago, on 1 April 02009, I gave a talk titled 20 Petaflops in 02012. [Yes, I used a 5-digit year.] The first quarter of 02012 has ended, and as far as I know our computing world has not hit 20 petaflops, but the next TOP500 list doesn’t come out until June. Regardless, I am 99.999% confident that 20 petaflops in 02012 is going to happen, primarily because of what IBM announced four months ago.
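Some back-of-the-envelope arithmetic (my rough figures: 20 petaflops peak and a world population of about 7 billion) puts that number in perspective:

  # Rough scale of 20 petaflops (2.0e16 FLOP/s).
  SEQUOIA_PEAK_FLOPS = 20e15   # 20 petaflops, peak
  WORLD_POPULATION   = 7e9     # roughly 7 billion people in 02012

  # If everyone on Earth did one calculation per second, how long
  # would it take to match a single second of a 20-petaflop machine?
  seconds = SEQUOIA_PEAK_FLOPS / WORLD_POPULATION
  print(f"{seconds:,.0f} seconds (about {seconds / 86_400:.0f} days)")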

[25 November 2011]
IBM issued a press release titled IBM Announces Supercomputer to Propel Sciences Forward with the sub-title Blue Gene/Q tackles major challenges of the day, delivers up to 100 petaflops at peak performance.

“When it is fully deployed in 2012 at Lawrence Livermore National Laboratory (LLNL), the system, named ‘Sequoia’, is expected to achieve 20 petaflops at peak performance, marking it as one of the fastest supercomputers in the world.”

[11 November 2011]
TOP500.org issued the following press release: Japan’s K Computer Tops 10 Petaflop/s to Stay Atop TOP500 List. Japan’s K computer was benchmarked at 10.51 petaflops.

[1 April 2009]
AzGrid::Talk::20 Petaflops in 02012

Two years earlier I gave a talk sub-titled Computing in the 21st Century. During that talk I stated the following: “The next era of computing is the era of High-Performance (Productivity) Computing (HPC).” In addition, during the talk I indicated that peta-scale computing was scheduled to occur in 02008.

TOP500.org posted the following on 18 June 02008: “With the publication of the latest edition of the TOP500 list of the world’s most powerful supercomputers today (Wednesday, June 18), the global high performance computing community has officially entered a new realm–a supercomputer with a peak performance of more than 1 petaflop/s [1.026 petaflops].”

[4 April 2007]
AzGrid::Talk::The Next Era of Computing

Petaflops processing yottabytes

17 November 2009

I posted the following two tweets on 02009.11.16…

  • Jaguar supercomputer hits 1.759 petaflops

    A supercomputer known as Jaguar has finally bested IBM’s Roadrunner supercomputer in the twice-yearly TOP500 list, but researchers have already begun looking into exascale supercomputers that consist of 100 million cores and run 1,000 times faster than Jaguar.

  • Yottabytes of data via “National Security Agency’s
    Surveillance Data Could Fill Two States by 2015”

    The NSA estimates it will have enough data by 2015 to fill a million datacenters spread across the equivalent combined area of Delaware and Rhode Island. The NSA wants to store yottabytes of data, and one yottabyte comes to 1,000,000,000,000,000 GB.
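A quick unit-conversion sketch (decimal SI prefixes assumed) confirms the yottabyte arithmetic:

  # Sanity-check the yottabyte figure using decimal (SI) units.
  BYTES_PER_GB = 10**9     # gigabyte
  BYTES_PER_PB = 10**15    # petabyte
  BYTES_PER_YB = 10**24    # yottabyte

  print(f"{BYTES_PER_YB // BYTES_PER_GB:,} GB per YB")  # 1,000,000,000,000,000
  print(f"{BYTES_PER_YB // BYTES_PER_PB:,} PB per YB")  # a billion petabytes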