The Future: Genetic Engineering, Nanotechnology, Robotics

29 July 2012

“The Future is genetic engineering, nanotechnology, and, ultimately, robotics,” says Ray Kurzweil.

TEDxNYU – Christopher Bradley – Synthetic Biology: This Will Change Everything

“We’re using genes as programming languages,” says Christopher Bradley. “Biology plus engineering equals synthetic biology.”

Empire of the Ants is a good bad movie. At one point while watching it I thought nanoants. An ant would be a great way to introduce the topic of nanotechnology. Most people have seen ants. Ants are small, but we can see them with our naked eyes. A nanoant is one-billionth of an ant. In other words, take an ant and cut it into a billion equally sized pieces.

Most realistic android infant? “They plan to add voice, body temperature, skin, body movements, and smell.”

Revisiting the ‘P’ in HPC

21 June 2012

“Don’t ask me about the ‘P’ in HPC…” is something that I’ve said a lot over the last half decade.

Era of Pervasive Computing

The article uses the phrase “the Era of Connected Devices”, but I like the phrase “the Internet of Things.” I won’t bother listing the “things,” but the posting was specifically focused on “sensors and devices that will monitor and sense our environments, collect data and provide timely and critical feedback.”

I like this quote: “Everything that will benefit from a connection will be connected.”

The following is something I wrote on 1 April 02009…

About the ‘P’ in HPC

HPC stands for High Performance Computing. Historically, the ‘P’ in HPC stood for “performance”; however, these days it stands for much more.

The following is my attempt to define HPC in one sentence.

HPC is a highly pervasive computing environment that
enables highly productive computing via a highly persistent
cyber-infrastructure that exploits highly parallel computing
to provide highly powerful computation.

With less than two years left in decade zero of the 21st century, HPC is really HP^6C — High Performance, Productivity, Pervasive, Persistent, Parallel, Powerful Computing.

HPC systems not only provide peta-scale computation, but they also provide powerful storage systems and, in many cases, powerful visualization systems (e.g. ASU’s Decision Theater).

HPC systems (which for the most part are hardware) need software and these days the software is way behind the hardware. In other words, today’s software is not even close to exploiting the power of HPC systems.

The ‘C’ in SCREAMers

13 April 2012

STEM and STEAM have become popular acronyms here in the early part of the 21st century. [If I’ve said this n times, I’ve seen it ++n times.]

I’ve never liked the STEM acronym. The first time I saw it I immediately asked “Where’s the Computing?” [And this is said out loud mimicking the way the old Wendy’s lady said “Where’s the Beef?” in those old Wendy’s commercials.] The same “Where’s the Computing?” question applies to the STEAM acronym. Sometime not that long ago I subjected the STEM and STEAM acronyms to the following question: “Where’s the Robotics?”

21st century STEM and STEAM depend on Computing, so I originally proposed changing STEM and STEAM to CSTEM and CSTEAM, respectively. There were three immediate problems: (0) CSTEM and CSTEAM are not really acronyms. (1) STEM and STEAM are too embedded in our society to change them (i.e. they’re immutable). (2) Where’s the Robotics?

Problem (1) might be impossible to repair, so I’m going to ignore that it exists. Problems (0) and (2) are eliminated with use of the SCREAM acronym. Let the Technology morph into technologies and bury it in the sciences (e.g. biotechnology and nanotechnology), the computing, the robotics, the engineering, the art and the mathematics.

I recently used STEMers and STEAMers to refer to scientists, technologists, engineers, artists, and mathematicians. SCREAMers include those plus roboticists and… oops… computerists? computists? compueers? computicians? computerologists? In those infamous grunts of Homer Simpson… D’oh! Hmm… It would be fun to be able to rewind to when there were no non-human computers and refer to the ‘C’ in SCREAMers as computers. SCREAMers are scientists, computers [humans], roboticists, engineers, artists, and mathematicians.

The ‘C’omputing in SCREAM includes both human and non-human computers. 21st century STEM, STEAM and SCREAM depend on all of us being “computers.”

The following was copied from…

The first use of the word “computer” was recorded in 1613, referring to a person who carried out calculations, or computations, and the word continued with the same meaning until the middle of the 20th century. From the end of the 19th century the word began to take on its more familiar meaning, a machine that carries out computations.

Miscellaneous Geekiness on 11/01/10

1 November 2010

Binary date and time… If we’re naughty and use a 2-digit year, then 11/01/10 (today) is a binary date. In the DD/MM/YY format 01/11/10 is palindromic. [Note: DD/MM/YY format is popular outside of the United States.] I’m looking forward to 11/10/10 and 11/11/10, which will be the last binary dates until 01/01/11.

Pi sighting… The Arizona Republic reported that Arizona has “3.14 million registered voters.” 3.14 is Pi rounded to the nearest hundredth.

Nano moment… I watched “The Tingler” on Halloween. During the movie Dr. Chapin said he was going to inject himself with “100 micro milligrams” of what I think was LSD. “The Tingler” was made in 01958. If it had been made in 02010, Dr. Chapin might have said he was going to inject 100 nanograms of drug.

Tweet milestone… On 02010.10.30 my @mathbabbler tweet counter hit one kilotweet. It took 509 days to reach that milestone.

Date and time I told them so… Last week during my “Computing Science for Non-CS Majors” course I gave an overview of the functions that come with the STDC (Standard C) Library. I mentioned that C and C++ don’t provide direct support for date and time processing, but that the STDC Library comes with a collection of date and time functions. I told the class that I would be covering some of the date and time functions because programming date and time is hard. Today, on 1 November 02010, @slashdot tweeted the following.

iPhone Alarm Bug Leads To Mass European Sleep-in

Three Tweets by @compufoo

25 October 2010

On 25 October 02010, the @compufoo Twitter account had 515 tweets, 21 followers, and was following zero.

The @compufoo Twitter account was set up to support my “Computer Science For Non-CS Majors” class. I asked the students to follow @compufoo, but I did not require them to do so. More than half of the students were not Twitter users; consequently, only about half of the class started following @compufoo.

The following are the last three tweets tweeted by @compufoo prior to writing this blog posting.

[02010.10.25] Computing students should follow Dan Reed. RT @HPCDan HPC and the Excluded Middle

Dan Reed is a supercomputing guru. In 02006, President George W. Bush appointed Dan Reed to the President’s Council of Advisors on Science and Technology (PCAST).

[02010.10.25] “Dawn of a New Day” by Ray Ozzie via @robinwauters & @techcrunch

On 18 October 02010, Ray Ozzie — one of the creators of Lotus Notes — stepped down as Microsoft’s Chief Software Architect.

[02010.10.23] What will the Internet look like in 10 years?

The Internet Society is a non-profit that was founded in 01992 to “provide leadership in Internet related standards, education, and policy.”

Two Is the Magic Number

23 September 2010

Two Is the Magic Number is an article by Joshua Wolf Shenk that “introduces a series on creative pairs.”

The phrases “two is the magic number” and “creative pairs” prompted me to write this blog posting.

The computing world has had its fair share of creative pairs. Some examples that immediately came to mind: Hewlett and Packard; Thompson and Ritchie; Jobs and Wozniak; Brin and Page.

The following is from Ken Thompson’s 1984 ACM Turing Award Lecture titled Reflections On Trusting Trust.

“That brings me to Dennis Ritchie. Our collaboration has been a thing of beauty. In the ten years that we have worked together, I can recall only one case of miscoordination of work. On that occasion, I discovered that we both had written the same 20-line assembly language program. I compared the sources and was astounded to find that they matched character-for-character. The result of our work together has been far greater than the work that we each contributed.”

Brian Kernighan’s story about the creation of awk has to do with two being the magic number.

“awk dates from, I think, 1977. It’s by far the biggest software project that I have been involved with. There were three of us in that, and that’s completely unworkable. Somehow, it’s much easier working with two rather than three. It’s harder to split things. There’s more divergence of opinion, sometimes that’s good because it means that the more things there but sometimes it means that it’s not as cohesive as it might be. On the other hand it was very very nice to work with Al Aho and Peter Weinberger so I had no problem with that.”

I had a Kernighan-awk-like experience while working at Discount Tire Company. Steve and I were the two “store system” programmers and our work-lives were good. It was easy to split the interesting work. Later, the company felt a third programmer was needed and almost immediately after Cliff’s arrival the quality of our work-lives took a hit. I failed at mentoring Cliff and splitting the interesting work three ways. After a short span of time Cliff got frustrated and left the company. After Cliff’s exit, Steve and I returned to our pleasant work-lives.

One last item about “creative pairs” and two being the “magic number.” Pair programming became popular about a decade ago. Pair programming in a nutshell: Two programmers share one keyboard. If the pairings are made correctly, then the quality of the software increases with no decrease in productivity.

Shenk’s article contained the equation “1 + 1 = Infinity” and in some instances it just might be true.

Chapters 0-2 of “Go To” by Steve Lohr

1 September 2010

I was moving some books from one pile to another pile and one of the books was Steve Lohr’s Go To (published in 2001). Instead of tossing the book into a new pile, I decided to give it a quick re-read and it has been an enjoyable allocation of time.

From the front cover: Go To is “the story of the Math Majors, Bridge Players, Engineers, Chess Wizards, Maverick Scientists and Iconoclasts–The Programmers who created the software revolution.”

On the back cover is the following quote from the Boston Sunday Globe: “Go To is an enlightening read and does a fine job of demonstrating the power of imagination. If you can imagine it and code it, you can indeed change the world.”

Page 5: Lohr quoted Maurice Wilkes: “By June 1949 people had begun to realize that it was not so easy to get a program right as had at one time appeared.” Later, while laboring on his first non-trivial program, Wilkes remembered that “the realization came over me with full force that a good part of the remainder of my life was going to be spent finding errors in my own programs.” In a nutshell, creating non-trivial programs has gotten easier over time, but it’s still hard.

Page 6: Lohr wrote about Robert Bemer. I know Bemer as the father of the ESC key, but during the 1950s he was a manager at IBM. During the 1950s, the computing world was focused on hardware instead of software, yet Bemer needed programmers and they were not easy to find. Lohr used the following Bemer quote: “I once decided to advertise for chess players because I thought they would be pretty good programmers.” In April of 2006, John Seely Brown wrote an article for Wired News titled: “You Play World of Warcraft? You’re Hired!”

Page 7: Lohr describes programmers as the “artisans, craftsmen, brick layers, and architects of the Information Age.” Although Lohr wrote this in 2001, it’s still true a decade later.

Page 8: I liked how Lohr described software as “the mediator between man and machine.”

Page 9: Lohr wrote about Donald Knuth. I introduce Knuth to my students in the CS1 class as being one of the greatest computer scientists alive today. [And I always show them Knuth’s collection of Diamond Signs.] Lohr quoted Knuth saying: “There are a certain percentage of undergraduates — perhaps two percent or so — who have the mental quirks that make them good at computer programming…. The two percent are the only ones who are really going to make these machines do amazing things. I wish it weren’t so, but that is the way it has always been.”

Page 11 is the start of chapter 2 titled Fortran: The Early “Turning Point”. When I mention Fortran during class, I usually mention the following: Fortran (formula translation) was created by John Backus and it was born in 1957 (the same year as my birth and the same year the Russians launched Sputnik). Lohr quoted Ken Thompson stating: “Ninety-five percent of the people who programmed in the early years would never have done it without Fortran. It was a massive step.” These days Fortran remains popular thanks to supercomputing. For example, a student might write programs in Fortran if they take a scientific computing class at Arizona State University (this was true a couple of years ago). I searched my website for Backus quotes and found the following: “Conventional [programming] languages are basically high-level, complex versions of the von Neumann computer.” Backus died in 2007 and that is when I posted the quote.

Pages 20 and 21: Lohr gets into the topic of number systems. From base-10 (decimal) to base-2 (binary) to base-8 (octal) to base-16 (hexadecimal). [Note: I wrote this at 10:14am MST on 31 August 2010. At about 3:30pm MST on the same day, I covered the topic of number systems in my CS1 class.]

Page 23: Lohr quotes Grace Hopper saying, “There had to be an interface built that would accept things that were people-oriented and then use the computer to translate to machine code.” [hint: compilers]

Page 24: Lohr mentions how Fortran had a goto statement. A decade later, in 1968, Edsger Dijkstra wrote a paper titled “Go To Statement Considered Harmful.” Lohr shares Dijkstra’s observation that the “quality of programmers is a decreasing function of the density of Go To statements in the programs they produce.” One of the things I have learned while helping others learn about programming is that if you teach them goto, they will use it. With respect to goto, I’m okay with people using it, but I don’t program it directly. [Note: goto comes in handy when you write programs that write programs.]

Page 35 is the start of chapter three titled “The Hard Lessons of the Sixties: From Exuberance to the Realities of COBOL and the IBM 360 Project” and chapter four titled “Breaking Big Iron’s Grip: Unix and C” begins on page 63. I’ll add a hyperlink to “Chapters 3-4 of Go To by Steve Lohr” when I get it completed.

KKK (Kapor, Kurzweil, Kay)

26 August 2010

02010.08.26: Day two of the Introduction to Computer Science class.

[Note: This posting uses 5-digit years.]

Rewind three weeks…

02010.08.03 at 2:10pm @mkapor tweeted:

I’m grouchy that so few people (except us old-timers) have even heard of Ted Nelson (Wikipedia bio)

Observe: Mitch Kapor hyperlinked into the Wikipedia.

02010.08.03 at 2:11pm @mkapor tweeted:

All of the web is in essence a pale shadow of just one of Ted Nelson’s dreams. Now do I have your attention?

Hmm… Who is Mitch Kapor?

02010.08.03 at 2:15pm @nanofoo in reply to @mkapor:

I’m going to make sure my CS1 students learn a bit about Ted Nelson this fall. They’ll come in knowing Gates & Jobs, but not Nelson.

@nanofoo never got a reply from @mkapor, but @rossk did…

02010.08.03 at 2:48pm @rossk in reply to @mkapor:

where should the Nelson-newbie start?

02010.08.03 at 8:07pm @mkapor in reply to @rossk:

Read “Computer Lib” by Nelson. Also see the Wired article on him for a dissenting view

Mitch Kapor did not provide his followers with hyperlinks, but here they are: Computer Lib/Dream Machine (dot-pdf) and The Curse of Xanadu

This posting uses 5-digit years and that is because I am a member of the Long Now Foundation. [Follow on Twitter @longnow.]

In 02002, Mitch Kapor made the first Long Bet (By 02029 no computer – or “machine intelligence” – will have passed the Turing Test.) with Ray Kurzweil. Hmm… Who is Ray Kurzweil? [Follow on Twitter @KurzweilAINews.]

02010.08.25 at 12:00pm @compufoo tweeted:

I agree with Ray Kurzweil that “exponential growth is the reality of information technology.” #future

I learned a lot from reading Ray Kurzweil’s The Singularity is Near.

In a nutshell, Kapor and Kurzweil are futurists whose last names start with the letter ‘K’. There’s another futurist whose last name starts with the letter ‘K’ and that is Alan Kay. In the Introduction to Computer Science class we use the C++ programming language. C++ supports object oriented programming (OOP) and Alan Kay is considered one of the fathers of OOP.

Alan Kay was once quoted saying: “The best way to predict the future is to invent it.”

The following picture of the Foundation Building at Arizona State University was taken during early August of 02010.

"The best way to predict the future is to invent it." -- Alan Kay

My 50s in Hex Equals My Dad’s 80s

19 June 2010

Truman is my dad. My birthday is at the end of May and Truman’s birthday is near the end of June. I turned 50 during May of 2007 and Truman turned 80 during June of 2007.

Now for a number system tidbit… 50 in base-16 (hexadecimal, or hex) is 80 in base-10 (decimal) because five units of 16 plus zero ones equals eight units of 10 plus zero ones. In many programming languages, numbers prefixed with 0x (or 0X) are numbers being represented in hex. For example, 0x50 is 80, 0x51 is 81, 0x52 is 82, 0x53 is 83, …, 0x59 is 89.

Now to the point of this posting… During my 6th and Truman’s 9th decades of life, for 11 of every 12 months, my age in hex equals Truman’s decimal age. When Truman and I turn 90 and 60, respectively, my hex age will no longer equal Truman’s age (i.e. 0x60 is 96, not 90).

Note: This posting was published when I was 53 and Truman was 82 (i.e. my age in hex was one more than Truman’s age in decimal).

Frustrating parallel vs. sequential EDU moment…

25 February 2010

I shared the following tweet with CS1 students on 24 February 02010.

Think parallel instead of sequential. RT @MathIsMyLife:
Sun Microsystems Laboratories – WPI News

I spent a couple of minutes explaining the difference between sequential and parallel processing.

Here comes the frustrating moment…

The students at WPI (Worcester Polytechnic Institute) were getting an opportunity to hear a lecture by computing guru Guy Steele titled “The Future Is Parallel: What’s a Programmer to Do? Breaking Sequential Habits of Thought.”

After covering the tweet I told the class… “We now return to our regularly scheduled sequential thinking.”