Why HPC? Weather Prediction is One of the Many Whys

16 April 2012

Inevitably, when I speak about HPC (supercomputing [petaflops and exaflops], visualization systems, Infinite Computing, etc.), I am asked the following question: Why? (as in, why do we need so many flops?) My response always starts with “weather forecasting…”, with an emphasis on forecasting such things as hurricanes and tornadoes. Accurate storm predictions can save lives.

The following is a headline from the Friday, 13 April 02012, Arizona Republic: Saturday storms ‘life threatening’.

“We’re quite sure tomorrow will be a very busy and dangerous day in terms of large swathes of central and southern plains.” — National Weather Service (NOAA.gov) via the Arizona Republic

Various news sources reported the following.

National Weather Service’s Storm Prediction Center in Norman, Okla., which specializes in tornado forecasting, took the unusual step of warning people more than 24 hours in advance of a possible “high-end, life-threatening event.”

The predictions ended up being extremely accurate: tornadoes hit the Midwestern United States hard on Saturday and Sunday.

The accuracy of weather forecasting is important because it can save lives. But right now accuracy is also critically important because of the need to establish trust among the populace.


20 Petaflops Coming Soon

31 March 2012

[1 April 02012]
Three years ago, on 1 April 02009, I gave a talk titled 20 Petaflops by 02012. [Yes, I used a 5-digit year.] The first quarter of 02012 has ended and, as far as I know, the computing world has not hit 20 petaflops, but the next TOP500 list doesn’t come out until June. Regardless, I am 99.999% confident that 20 petaflops in 02012 is going to happen, primarily because of what IBM announced four months ago.

[25 November 2011]
IBM issued a press release titled IBM Announces Supercomputer to Propel Sciences Forward, with the sub-title Blue Gene/Q tackles major challenges of the day, delivers up to 100 petaflops at peak performance.

“When it is fully deployed in 2012 at Lawrence Livermore National Laboratory (LLNL), the system, named ‘Sequoia’, is expected to achieve 20 petaflops at peak performance, marking it as one of the fastest supercomputers in the world.”

IBM.com::IBM Announces Supercomputer to Propel Sciences Forward

[11 November 2011]
TOP500.org issued the following press release: Japan’s K Computer Tops 10 Petaflop/s to Stay Atop TOP500 List. Japan’s K computer was benchmarked at 10.51 petaflops.

[1 April 2009]
AzGrid::Talk::20 Petaflops in 02012

Two years earlier I gave a talk sub-titled Computing in the 21st Century. During that talk I stated the following: “The next era of computing is the era of High-Performance (Productivity) Computing (HPC).” In addition, during the talk I indicated that peta-scale computing was scheduled to occur in 02008. TOP500.org posted the following on 18 June 02008: “With the publication of the latest edition of the TOP500 list of the world’s most powerful supercomputers today (Wednesday, June 18), the global high performance computing community has officially entered a new realm–a supercomputer with a peak performance of more than 1 petaflop/s [1.026 petaflops].”

[4 April 2007]
AzGrid::Talk::The Next Era of Computing


Three Tweets by @compufoo

25 October 2010

On 25 October 02010, the @compufoo Twitter account had 515 tweets, 21 followers, and was following zero accounts.

The @compufoo Twitter account was set up to support my “Computer Science For Non-CS Majors” class. I asked the students to follow @compufoo, but I did not require them to do so. More than half of the students were not Twitter users; consequently, only about half of the class started following @compufoo.

The following are the last three tweets posted by @compufoo prior to this blog posting.

[02010.10.25] Computing students should follow Dan Reed. RT @HPCDan HPC and the Excluded Middle http://bit.ly/dj0B8s

Dan Reed is a supercomputing guru. In 02006, President George W. Bush appointed Dan Reed to the President’s Council of Advisors on Science and Technology (PCAST).

[02010.10.25] “Dawn of a New Day” by Ray Ozzie http://goo.gl/ti6w via @robinwauters & @techcrunch

On 18 October 02010, Ray Ozzie — one of the creators of Lotus Notes — stepped down as Microsoft’s Chief Software Architect.

[02010.10.23] What will the Internet look like in 10 years? http://www.isoc.org/tools/blogs/scenarios/

The Internet Society (ISOC.org) is a non-profit that was founded in 01992 to “provide leadership in Internet related standards, education, and policy.”


One millionth of one percent

8 February 2010

The following quote was on my iGoogle homepage on 02010.02.03.

“We don’t know a millionth of one percent about anything.” — Thomas Edison (d.01931)

Edison’s quote prompted me to come up with the following arithmetic question: What is one millionth of one percent of 1.759 quadrillion?

I used 1.759 quadrillion because the world’s fastest supercomputer could perform that many floating-point operations in one second (1.759 petaflops).
And, by the way, one millionth of one percent of 1.759 quadrillion is 17,590,000.
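
For readers who want to check the arithmetic, here is a minimal Python sketch (exact integers; the variable names are mine):

    # One millionth of one percent of 1.759 quadrillion.
    flops = 1_759_000_000_000_000      # 1.759 quadrillion (1.759 petaflops)
    one_percent = flops // 100         # 17,590,000,000,000
    answer = one_percent // 1_000_000  # one millionth of one percent
    print(answer)                      # 17590000, i.e. 17,590,000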

In the 21st century, Edison might have said that we don’t know a quadrillionth of one percent about anything.


From 140 to 1,759,000,000,000,000

2 December 2009

This semester I have been hammering away at two themes: Twitter and High-Performance Computing.

Twitter is a micro-blogging service that limits postings (i.e. tweets) to a maximum of 140 characters.

In November 02009, TOP500 issued its list of the world’s 500 fastest supercomputers, and Jaguar was number one at 1.759 petaflops (i.e. 1,759 trillion floating-point operations per second).

The lecture notes for weeks 13 (02009.11.16) and 15 (02009.12.01) contained the following tweets.

The Global Language Monitor names “Twitter” the top word of 2009.

I’ve been telling students that I am in the process of learning about Twitter. For me, as of 02009.12.01, the power of Twitter is in who I follow.

President Obama visited China and he had Twitter on his mind.

“First of all, let me say that I have never used Twitter.”~Obama to Chinese

I don’t know why Obama had to let the Chinese in on the fact that he had never tweeted.

“I’m a big supporter of not restricting Internet use, Internet access, other information technologies like Twitter.”~Obama to Chinese

Obama referred to Twitter as a form of “information technology,” which these days I call 21st century Informatics. In a nutshell, 21st century Informatics is supercomputer-based data processing.

Al Gore was the keynote speaker at the 21st annual SC conference. SC is an “international conference on High Performance Computing (HPC), networking, storage and analysis.”

At SC09, Al Gore says supercomputing can be a killer app in climate change.

Gore believes high-performance computing systems, which include high-performance visualization systems, will help convince the world that climate change is a real problem. Gore might be wise to expect the unexpected.

“Supercomputing has given us the most powerful tool in the history of civilization.”~Al Gore at SC09

A bold statement by Gore and only time will tell if he is correct.

By the way, Steve Wozniak once said, “Never trust a computer you can’t throw out a window.”


Snoop Dogg, Aristotle, Donald Knuth

21 November 2009

On back-to-back days I mined quotes by Calvin Broadus (Snoop Dogg) and Aristotle. Dogg’s quote was about math and Aristotle’s quote was about teaching. The combination of math and teaching caused me to recall a quote by Donald Knuth about the importance of computing.

If you stop at general math, you’re only going to make general math money. — Snoop Dogg

I’m curious as to how Dogg defines “general math money.” Minimum wage? Less than six-figure salaries? Less than $1 million per year? I suspect Dogg makes abstract algebra money.

Teaching is the highest form of understanding. — Aristotle

The quotes by Dogg and Aristotle reminded me of the following quote by Donald Knuth.

It has often been said that a person does not really understand something until he teaches it to someone else. Actually a person does not really understand something until he can teach it to a computer, i.e., express it as an algorithm. The attempt to formalize things as algorithms leads to a much deeper understanding than if we simply try to comprehend things in the traditional way. — Donald Knuth [1]

Programming is how a person “teaches” a computer, yet students don’t have to learn about programming in K-12. And many (a majority of?) students get college degrees without ever learning a programming language. We are living in the CSTEM era, and 21st century STEM depends on Computing, yet our educational systems seem to ignore this reality.
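
In the spirit of Knuth’s remark, here is a minimal Python sketch of “teaching” a computer something by expressing it as an algorithm; the example (Euclid’s greatest-common-divisor algorithm) is my choice, not Knuth’s:

    # Euclid's algorithm: the greatest common divisor of two positive integers.
    # Writing it as code forces us to state every step precisely.
    def gcd(a, b):
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(140, 21))  # prints 7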

Aristotle might have been a dude in his day, but he didn’t have supercomputers at his fingertips. Knuth is a grossly unknown modern-day polymath who would be quickly whatevered by most young people. But what about Snoop?

Snoop Dogg tells his fans to learn beyond general math. Kudos to Dogg. It would be nice if Mr. Dogg would rap about the importance of learning about the base-2 number system (i.e. the code).
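
For what it is worth, here is a tiny Python sketch of the base-2 idea (my example, using Twitter’s 140-character limit as the number):

    n = 140
    print(bin(n))              # 0b10001100 -- 140 written in base-2
    print(int("10001100", 2))  # 140 -- converting back from base-2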

[1] Donald Knuth is a “computer scientist and Professor Emeritus of the Art of Computer Programming at Stanford University.” [Wikipedia]


Petaflops processing yottabytes

17 November 2009

I tweeted the following two tweets on 02009.11.16…

  • Jaguar supercomputer hits 1.759 petaflops http://bit.ly/3eInDv

    A supercomputer known as Jaguar has finally bested IBM’s Roadrunner supercomputer in the biannual TOP500 list, but researchers have already begun looking into exascale supercomputers that consist of 100 million cores and run 1,000 times faster than Jaguar.

  • Yottabytes of data via PopSci.com “National Security Agency’s
    Surveillance Data Could Fill Two States by 2015” http://ow.ly/CRGE

    The NSA estimates it will have enough data by 2015 to fill a million datacenters spread across the equivalent combined area of Delaware and Rhode Island. The NSA wants to store yottabytes of data, and one yottabyte comes to 1,000,000,000,000,000 GB.
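
The scale of those two numbers is easier to appreciate with a quick back-of-the-envelope check in Python (the figures come from the two items above; the code is mine):

    # Exascale: roughly 1,000 times faster than Jaguar's 1.759 petaflops.
    jaguar_flops = 1.759e15                # 1.759 quadrillion flops
    exascale_flops = 1_000 * jaguar_flops  # about 1.759e18 flops
    print(exascale_flops)

    # One yottabyte (10**24 bytes) expressed in gigabytes (10**9 bytes).
    print(10**24 // 10**9)                 # 1000000000000000, i.e. 1,000,000,000,000,000 GB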