A Couple Turing Test Moments

25 April 2012

I use a service called Timehop that sends me an email message every day containing the tweets that my @nanofoo character tweeted one year ago. Today, 25 April 02012, Timehop reminded me that I tweeted the following on 25 April 02011.

#TuringTest RT @factlets: Software produced original music in style of great composers fools experts. http://factlets.info/SyntheticMusic

Notice the use of the #TuringTest hashtag in the tweet.

Yesterday, my @compufoo character tweeted the following.

#TuringTest RT @MachinesLikeUs Can computers pass as human? http://goo.gl/fb/uP6HW

Notice the use of the #TuringTest hashtag in the tweet.

LongBets.org::#1::By 2029 no computer – or “machine intelligence” – will have passed the Turing Test


Why HPC? Weather Prediction is One of the Many Whys

16 April 2012

Inevitably, when I speak about HPC (supercomputing [petaflops and exaflops], visualization systems, Infinite Computing, etc.), I am asked the following question: Why? (i.e. why do we need so many flops?) My response always starts with “weather forecasting…” with an emphasis on predicting such things as hurricanes and tornadoes. Accurate storm predictions can save lives.

The following is a headline from the Friday, 13 April 02012, Arizona Republic: Saturday storms ‘life threatening’.

“We’re quite sure tomorrow will be a very busy and dangerous day in terms of large swathes of central and southern plains.” — National Weather Service (NOAA.gov) via the Arizona Republic

Various news sources reported the following.

National Weather Service’s Storm Prediction Center in Norman, Okla., which specializes in tornado forecasting, took the unusual step of warning people more than 24 hours in advance of a possible “high-end, life-threatening event.”

The predictions ended up being extremely accurate: Tornadoes hit the Midwestern United States hard on Saturday and Sunday.

The accuracy of weather forecasting is important because it can save lives. But right now accuracy is critically important because of the need to establish trust among the populace.


The ‘C’ in SCREAMers

13 April 2012

The STEM and STEAM acronyms have become popular here in the early part of the 21st century. [If I’ve said this n times, I’ve seen it ++n times.]

I’ve never liked the STEM acronym. The first time I saw it I immediately asked “Where’s the Computing?” [And this is said out loud, mimicking the way the lady in those old Wendy’s commercials said “Where’s the Beef?”] The same “Where’s the Computing?” question applies to the STEAM acronym. Sometime not that long ago I subjected the STEM and STEAM acronyms to the following question: “Where’s the Robotics?”

21st century STEM and STEAM depend on Computing, so I originally proposed changing STEM and STEAM to CSTEM and CSTEAM, respectively. There are three immediate problems: (0) CSTEM and CSTEAM are not really acronyms. (1) STEM and STEAM are too embedded in our society to change them (i.e. they’re immutable). (2) Where’s the Robotics?

Problem (1) might be impossible to repair, so I’m going to ignore that it exists. Problems (0) and (2) are eliminated with the use of the SCREAM acronym. Let the Technology morph into technologies and bury them in the sciences (e.g. biotechnology and nanotechnology), the computing, the robotics, the engineering, the art and the mathematics.

I recently used STEMers and STEAMers to refer to scientists, technologists, engineers, artists, and mathematicians. SCREAMers include those plus roboticists and… oops… computerists? computists? compueers? computicians? computerologists? In those infamous grunts of Homer Simpson… D’oh! Hmm… It would be fun to be able to rewind to when there were no non-human computers and refer to the ‘C’ in SCREAMers as computers. SCREAMers are scientists, computers [humans], roboticists, engineers, artists, and mathematicians.

The ‘C’omputing in SCREAM includes both human and non-human computers. 21st century STEM, STEAM and SCREAM depend on all of us being “computers.”

The following was copied from Wikipedia.org…

The first use of the word “computer” was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations.


SKA Radio Telescope I Get, But Why Megadata?

12 April 2012

DOME: IBM and ASTRON’s Exascale Computer for SKA Radio Telescope contains a picture that in turn contains the word megadata. My last posting focused on an article that used the word nanodata. I just didn’t get what nanodata meant, but I sort of understand megadata. The following is a quote from the SingularityWeblog.com posting:

The SKA is an international consortium to build the world’s largest and most sensitive radio telescope. Scientists estimate that the processing power required to operate the telescope will be equal to 100 million of today’s fastest desktop computers.
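To get a feel for the scale that quote implies, here is a bit of back-of-envelope arithmetic. The per-desktop figure of roughly 100 gigaflops is my assumption for a fast 02012 desktop, not a number from the article.

# Rough scale of the SKA processing estimate ("100 million of today's
# fastest desktop computers"). The per-desktop peak is an assumption.
desktop_flops = 100e9   # assume ~100 gigaflops for a fast 02012 desktop
desktops = 100e6        # 100 million desktops, per the quote above

total_flops = desktop_flops * desktops
print(f"Estimated processing requirement: {total_flops:.1e} flops")  # 1.0e+19
print(f"That is roughly {total_flops / 1e18:.0f} exaflops")          # ~10 exaflops

However the arithmetic is sliced, the requirement lands in exaflops territory, which is consistent with DOME being described as an exascale system.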

I understand the use of megadata to describe lots of data, but I would have used yottadata.

Borrowing from Pink Floyd’s “Comfortably Numb”… Is there anybody out there?


Ubiquitous Sensing I Get, But What’s Nanodata?

11 April 2012

The Computing Trend that Will Change Everything had the sub-title “Computing isn’t just getting cheaper. It’s becoming more energy efficient. That means a world populated by ubiquitous sensors and streams of nanodata.”

Ubiquitous sensors imply streams of data. That I get. But what’s nanodata?

Harvesting background energy flows, including ambient light, motion, or heat, opens up the possibility of mobile sensors operating indefinitely with no external power source, and that means an explosion of available data.

An “explosion of data” implies to me yottadata (as in yottagoo). Again, what’s nanodata?

According to the MIT Technology Review article, nanodata is “customized fine-grained data describing in detail the characteristics of individuals, transactions, and information flows.” To me it seems as if nanodata is a form of metadata (i.e. data about data).
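Part of my confusion is that the SI prefixes point in opposite directions: nano- names something vanishingly small, while mega- and yotta- name something huge. A quick sketch of the standard prefix values (these are just the SI definitions, nothing from the article):

# Standard SI prefixes, which is how I instinctively read "megadata",
# "yottadata", and "nanodata" as size words.
si_prefixes = {
    "nano":  1e-9,   # a billionth (tiny)
    "mega":  1e6,    # a million
    "peta":  1e15,   # a quadrillion
    "yotta": 1e24,   # the largest standard SI prefix (as of 02012)
}

for name, value in si_prefixes.items():
    print(f"{name:>5}: {value:.0e}")

Read as a size word, nanodata would be almost no data at all; the article instead uses it for fine-grained, per-individual data, which is why it reads to me like a flavor of metadata.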

I still don’t get the term nanodata, but I consider that okay. Bottom line: It’s possible ubiquitous sensing is our future, and that implies infinite data being piped into an Infinite Computing environment.


20 Petaflops Coming Soon

31 March 2012

[1 April 02012]
Three years ago, on 1 April 02009, I gave a talk titled 20 Petaflops by 02012. [Yes, I used a 5-digit year.] The first quarter of 02012 has ended and, as far as I know, our computing world has not hit 20 petaflops, but the next TOP500 list doesn’t come out until June. Regardless, I am 99.999% confident that 20 petaflops in 02012 is going to happen, primarily because of what IBM announced four months ago.

[25 November 2011]
IBM issued a press release titled IBM Announces Supercomputer to Propel Sciences Forward with the sub-title Blue Gene/Q tackles major challenges of the day, delivers up to 100 petaflops at peak performance.

“When it is fully deployed in 2012 at Lawrence Livermore National Laboratory (LLNL), the system, named ‘Sequoia’, is expected to achieve 20 petaflops at peak performance, marking it as one of the fastest supercomputers in the world.”

IBM.com::IBM Announces Supercomputer to Propel Sciences Forward

[11 November 2011]
TOP500.org issued the following press release: Japan’s K Computer Tops 10 Petaflop/s to Stay Atop TOP500 List. Japan’s K computer was benchmarked at 10.51 petaflops.

[1 April 2009]
AzGrid::Talk::20 Petaflops in 02012

Two years earlier I gave a talk sub-titled Computing in the 21st Century. During that talk I stated the following: “The next era of computing is the era of High-Performance (Productivity) Computing (HPC).” In addition, during the talk I indicated that peta-scale computing was scheduled to occur in 02008. TOP500.org posted the following on 18 June 02008: “With the publication of the latest edition of the TOP500 list of the world’s most powerful supercomputers today (Wednesday, June 18), the global high performance computing community has officially entered a new realm–a supercomputer with a peak performance of more than 1 petaflop/s [1.026 petaflops].”
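Pulling together the figures quoted in this post gives a rough sense of the trajectory. The sketch below is just arithmetic over those numbers; the dates are approximate (June 02008 and November 02011), and note that Sequoia’s 20 petaflops is a peak figure while the K computer’s 10.51 is a benchmarked result.

import math

# Figures quoted in this post:
first_petaflop_pf = 1.026   # June 02008, first system past 1 petaflop/s on the TOP500
k_computer_pf     = 10.51   # November 02011, Japan's K computer (benchmarked)
sequoia_pf        = 20.0    # 02012, IBM's announced peak for Sequoia at LLNL

years = 2011.9 - 2008.5     # roughly June 02008 to November 02011
annual_growth = (k_computer_pf / first_petaflop_pf) ** (1 / years)

print(f"Implied annual growth of the top system: {annual_growth:.2f}x")             # ~1.98x
print(f"Implied doubling time: {math.log(2) / math.log(annual_growth):.1f} years")  # ~1 year
print(f"Sequoia's target is {sequoia_pf / k_computer_pf:.1f}x the K computer")      # ~1.9x

In other words, the fastest system has been roughly doubling every year, which is part of why 20 petaflops in 02012 looks like a safe bet.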

[4 April 2007]
AzGrid::Talk::The Next Era of Computing


Infinite Computing Deprecates the Delete Function

26 March 2012

On 26 March 02012 I discovered that Facebook has a Usenet page, so I “Liked” it. My curiosity prompted me to do some Usenet searching and I found a net.music posting that I made almost 30 years ago on 5/8/01982. http://goo.gl/16mNL

Quoting Voltaire: “Doubt is not a pleasant condition, but certainty is absurd.”

I’m going to ignore Voltaire and say that I’m 99.999% certain that we have entered into the “era” of “Infinite Computing.” [I put quotes around “era” because to me an era has an endpoint.] To me infinite computing implies infinite storage and infinite storage deprecates the “delete” function. In other words, a bit, once posted, is never deleted. We can click the DELETE key over and over and over, but being certain that our bits have been deleted might be absurd.
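In storage-system terms, this is the difference between a destructive delete and a soft delete: an append-only store never overwrites old bits, it just records a tombstone. A minimal sketch of that idea (my own illustration, not a model of any particular product):

# Append-only store: "delete" appends a tombstone record instead of
# removing anything, so every bit ever written is still there.
class AppendOnlyStore:
    def __init__(self):
        self.log = []  # every event ever written, never mutated

    def put(self, key, value):
        self.log.append(("put", key, value))

    def delete(self, key):
        self.log.append(("tombstone", key, None))  # marks, does not erase

    def get(self, key):
        # Latest entry wins; a tombstone makes the key look deleted.
        for op, k, value in reversed(self.log):
            if k == key:
                return None if op == "tombstone" else value
        return None

store = AppendOnlyStore()
store.put("posting", "a net.music message from 01982")
store.delete("posting")
print(store.get("posting"))   # None, so it looks deleted
print(store.log)              # but the original bits are still in the log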

About “Infinite Computing”

On 4 April 02007 I gave a talk titled The Next Era of Computing: Computing in the 21st Century. The talk included the following blurb.

The following is my one sentence description of the next era of computing: A grid-based cyber-infrastructure that provides infinite computational power, infinite storage, infinite bandwidth and infinite services (utilities).

The phrase Infinite Computing is used in the book Abundance: The Future is Better Than You Think (Copyright 02012 by Peter H. Diamandis and Steven Kotler).