Archive for the ‘History of Computing’ Category

Alan Turing: a household name?

November 25, 2012

A hundred years have elapsed since the birth of the mathematician, codebreaker, and father of computer science, Alan Turing.

Due to space restrictions, a drastically shorter version of what follows appeared on page 16 of the November/December ACS Information Age magazine.

In response to an online petition in the lead-up to the centenary of Alan Turing’s birth, the British PM, Gordon Brown, said in 2009: “Turing was dealt with under the laws of the time, and we can’t put the clock back, his treatment was utterly unfair. On behalf of the British government and all those who live freely thanks to Alan’s work, I am very proud to say: we’re sorry. You deserved so much better.” [1] This statement concerned the appalling treatment Turing received for “gross indecency”, the charge made against him as a homosexual person living in the UK in the mid-20th century. His choices were chemical castration and jail. He chose the former, which affected his concentration and self-esteem, undoubtedly contributing to his apparent suicide via a cyanide-dipped apple in 1954.

It would be an understatement to say that Turing achieved much in his 42 years. He settled a fundamental problem in mathematics, in the process becoming the father of computer science before general purpose computing machines existed. He played a pivotal role in the Second World War as a Bletchley Park cryptanalyst, for which he was awarded an OBE, wrote a seminal paper on the modeling of biological growth, worked on pioneering computer projects, and founded the field of Artificial Intelligence (AI).

For anyone in the computing field, Turing’s most important contribution was his 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem” and, in particular, the abstraction he described and used to present what we now call the halting problem: the Turing Machine, the conceptual essence of a general purpose computer.
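The mechanics of the abstraction are simple enough to sketch in a few lines. Here is a toy Python simulator (the transition-table encoding and the bit-flipping example machine are my own, purely for illustration) showing the essentials: a tape, a head, a state, and a table of rules:

```python
# A minimal Turing machine sketch: a tape, a head position, a current
# state, and a transition table. This toy machine flips each bit of a
# binary tape and then halts when it reads a blank ("_").
def run(tape, rules, state="start", halt="halt"):
    tape = list(tape)
    head = 0
    while state != halt:
        symbol = tape[head] if head < len(tape) else "_"
        write, move, state = rules[(state, symbol)]
        if head < len(tape):
            tape[head] = write
        else:
            tape.append(write)
        head += 1 if move == "R" else -1
    return "".join(tape).rstrip("_")

# Transition table: (state, symbol read) -> (symbol to write, move, next state)
rules = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", rules))  # 0100
```

Every program running on every computer today is, in principle, reducible to a (much larger) table of this kind, which is why the abstraction matters so much.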

Turing was keenly interested in algorithms and applications, independently arriving at the utility of the subroutine library. He wrote and optimised early library routines (e.g. for long division and random number generation) and investigated numerical analysis problems such as rounding errors. He wrote number-theoretic code for the Manchester computer, and a chess program that was only ever simulated on paper because no computer time was made available.

For those with more pragmatic inclinations: from 1945 to 1951, after his time at Bletchley Park and contemporaneous with ACS founder John Bennett’s work on CSIRAC, Turing was involved with pioneering computer projects, including the design of the Automatic Computing Engine (ACE, later built as the Pilot ACE), the Manchester Baby or Small Scale Experimental Machine (SSEM), and the Ferranti Mark I, for which he wrote the Programmer’s Manual in 1951. The Pilot ACE is on display in the London Science Museum. His ACE design changed frequently; it was optimal in terms of hardware but complex to program. Turing said: “In working on the ACE I am more interested in the possibility of producing models of the brain than in the practical applications to computing”. [2]

Early insights into the nature of AI, set down in a paper entitled “Computing Machinery and Intelligence”, published in the philosophy journal Mind in 1950, led to his notion of an “Imitation Game”, the now famous “Turing Test”: a means of determining whether a questioner is communicating with an entity possessing human-level intelligence.

The ACM presents the Turing Award annually to someone judged to have made a contribution of major and lasting importance to the computing field. One of its recipients, Alan Perlis, said in 1966: “On what does and will the fame of Turing rest? That he proved a theorem showing that for a general computing device—later dubbed a “Turing Machine”—there existed functions which it could not compute? I doubt it. More likely it rests on the model he invented and employed: his formal mechanism. This model has captured the imagination and mobilized the thoughts of a generation of scientists”.

Arguably, Turing is to Computing as Einstein is to Physics. In 2005, there were celebrations worldwide of Einstein’s “year of miracles”. This year there have been similar celebrations of Turing’s birth 100 years ago. [3] Einstein and E=mc² are well known, but can the same be said of Turing and his Machine? Is he a household name along with Einstein? Many take for granted the existence of the computer, smart phones, and myriad other computationally-enabled devices found in virtually every facet of our modern lives. We, as computing professionals, should strive to make better known the work of Turing and his contemporaries, and more generally, the broader history of our field.

I looked into the possibility of an Adelaide cinema screening of the film CODEBREAKER, about Turing’s life (via TodPix), but received a response saying there are no plans for a theatrical release in Australia. It was screened on SBS One in June 2012 under a three-year contract, so perhaps it will be aired again.

Update (November 2013): CODEBREAKER is now available for sale on DVD!

References

  1. http://www.abc.net.au/radionational/programs/scienceshow/alan-turing-e28093-thinker-ahead-of-his-time/4034006
  2. Lavington, S. (ed.), 2012, “Alan Turing and his Contemporaries”, BCS
  3. http://amturing.acm.org/acm_tcc_webcasts.cfm
  4. http://www.turingfilm.com
  5. http://www.turing.org.uk/turing/
  6. “The ACM Turing Award Lectures: The First Twenty Years”, 1987, ACM Press

Explaining how computers work with the TEC-1

May 30, 2010

I was recently asked to give a talk to my son’s primary school class about how computers work.

The PowerPoint slides from the talk consist mostly of pictures and, towards the end, a small Z80 machine code program (to add a number to itself) for the TEC-1 single board computer.

TEC-1 image from the cover of Issue 10 of Talking Electronics Magazine

My wife and I created a short video showing the program being entered and executed multiple times via the TEC-1’s hex keypad.

As I told the kids during that talk, if you want to understand how a computer really works, you need to get close to the machine level and talk about processors, memory, buses and so on. So we did, and despite leaving out a lot of detail, I think the idea of going from X = X+X to a sequence of simple instructions in a numeric representation palatable to a Z80 made some sense to many of the kids, and was at least a source of fascination for most. Apart from that, it was fun.
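The jump from X = X+X down to machine steps can be sketched in a few lines. The following Python toy (the mnemonics and the tiny three-instruction machine are invented here for illustration; the talk itself used real Z80 code on the TEC-1) walks a value through a load/add/store sequence, which is the essence of what the hex keypad program did:

```python
# A hypothetical 3-instruction machine illustrating how X = X+X becomes
# a sequence of simple steps: load X into a register, add X to it,
# store the result back. (Invented for illustration, not real Z80 code.)
memory = {"X": 21}
registers = {"A": 0}

program = [
    ("LOAD", "X"),   # A <- memory[X]
    ("ADD", "X"),    # A <- A + memory[X]
    ("STORE", "X"),  # memory[X] <- A
]

for op, addr in program:
    if op == "LOAD":
        registers["A"] = memory[addr]
    elif op == "ADD":
        registers["A"] += memory[addr]
    elif op == "STORE":
        memory[addr] = registers["A"]

print(memory["X"])  # 42
```

On the real TEC-1 each of those mnemonics would in turn be a numeric opcode entered byte by byte on the hex keypad, which is exactly the step that tends to fascinate kids.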

We also spent a lot of time talking about the extent to which computers now pervade our lives, and how much we take that, and the people whose ideas and work made it all possible, for granted: Leibniz, Boole, Babbage and Lovelace, Turing, and so many hardware and software pioneers.

Like many hobbyists in the 70s, 80s and beyond, I found the idea of building a simple computer from a kit of components alluring. I’ve been doing paid software development for almost 30 years, but was a hobbyist for more than a decade before that. I was introduced to the Joy of Computing in Year 10, when my school (Norwood High in Adelaide) bought a PDP-11/04 in the late 70s. Along with a love of astronomy that continues to this day, I maintained an interest in programming throughout the 80s, during which time I was a nurse. I eventually converted one of my hobbies into a profession, but still maintain the attitude of a hobbyist, developing open source software such as my current project: VStar.

My hope is that I’ve instilled in at least some of those kids a hunger to know more about computers and programming.

Lisp’s 50th birthday

October 29, 2008

John McCarthy’s Lisp programming language turned 50 years old in October 2008. Lisp is the second-oldest programming language still in use today, after Fortran.

John McCarthy

Lisp50 at OOPSLA 2008 celebrated Lisp’s contributions.

I celebrated by giving a talk to the Australian Java User Group in Adelaide about Clojure, a new dialect of Lisp for the JVM.

There’s a lot of interesting material to be found by Googling, but here are a few relevant links:

A decade ago I developed LittleLisp for the ill-fated Newton PDA.

There’s a nice parody song called The Eternal Flame which is all about Lisp, and there are some amusing xkcd Lisp cartoons.
Lisp still looms large:
  • in Emacs, as Elisp;
  • in mature free implementations (e.g. take a look at PLT Scheme);
  • in active commercial implementations (e.g. the LispWorks mailing list is very active).
Lisp refuses to lie down and die. In his 1979 History of Lisp paper, John McCarthy said:

One can even conjecture that LISP owes its survival specifically to the fact that its programs are lists, which everyone, including me, has regarded as a disadvantage. 

In ANSI Common Lisp, Paul Graham points out that Lisp has always put its evolution into the hands of its programmers, and that this is why it survives, especially via the macro systems found in some dialects (e.g. Common Lisp, Clojure), which make the full power of the language available to generate Lisp code at compile time.

Irrespective of how widely used Lisp dialects are today, we should continue to remember its contributions to programming: code as data, higher order functions, application of functions to the elements of a list, an emphasis upon recursive solutions to problems, erasure of abandoned data (garbage collection), the Read-Eval-Print Loop (REPL), to name a few.
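Several of those contributions are easy to illustrate even outside Lisp, precisely because most modern languages have absorbed them. A small Python sketch (my own, for illustration) of higher-order functions, application of a function to the elements of a list, and a recursive solution:

```python
# Higher-order functions and recursion over a list -- ideas Lisp
# pioneered. my_map takes a function as an argument and applies it
# to each element, defined recursively in the Lisp style:
# the map of an empty list is empty; otherwise apply f to the head
# and recurse on the tail.
def my_map(f, xs):
    if not xs:
        return []
    return [f(xs[0])] + my_map(f, xs[1:])

print(my_map(lambda x: x * x, [1, 2, 3, 4]))  # [1, 4, 9, 16]
```

That a function can be passed around as a value, and that a list is processed by head-and-tail recursion, both read as unremarkable today; in 1958 they were radical.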

As for the future, it’s always uncertain. Here are some notes about the future of Lisp from the OOPSLA Lisp50 session, which suggest that Clojure may be a big part of it. Next year’s International Lisp Conference has the working title “Lisp: The Next 50 Years”.
I’ll end with a quote from Edsger Dijkstra:

Lisp has jokingly been called “the most intelligent way to misuse a computer”. I think that description is a great compliment because it transmits the full flavor of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.

On the importance of pure research

January 14, 2007

I recently finished reading the book Engines of Logic (2000) by Martin Davis, of Davis-Putnam SAT-solver algorithm fame (apparently published as The Universal Computer in some countries). It is about the origins of computer science from the viewpoint of the mathematicians who founded it, in particular: Leibniz, Boole, Frege, Cantor, Hilbert, Gödel and Turing.

Leibniz had the notion that it ought to be possible to write down ideas in a language (he called this a universal characteristic) such that “serious men of good will” could sit together and solve a problem by calculation, using an algebra of logic he referred to as the calculus ratiocinator.

Despite attempts at such a language and algebra of logic by Leibniz, it was ultimately the work of his successors that gave rise to the logic that made automated computation possible.

Of Leibniz’s work, Davis said: “What Leibniz has left us is his dream, but even this dream can fill us with admiration for the power of human speculative thought and serve as a yardstick for judging later developments.”

In the epilogue, Davis had this to say:

The Dukes of Hanover thought they knew what Leibniz should be doing with his time: working on their family history. Too often today, those who provide scientists with the resources necessary for their lives and work try to steer them in directions deemed most likely to provide quick results. This is not only likely to be futile in the short run, but more importantly, by discouraging investigations with no obvious immediate payoff, it shortchanges the future.

These days, universities and, it seems, too many aspects of society are becoming shackled to the oft-times short-sighted and petty expectations of business, as if business mattered as an end in itself. We would do well to pay attention to history.

On the subject of history, it occurs to me increasingly that most of what we study is in fact historical in nature. Incremental advances in computer science, software engineering, astronomy, and Science in general are mere blips on the vast landscape of accumulated knowledge. When I read books such as Engines of Logic and The Art of Electronics, I am overwhelmed by the contributions of countless scientists and engineers over decades, to say nothing of the work of the founders of Science such as Copernicus, Galileo, Kepler, Newton, and Einstein.

Kierkegaard and Stroustrup

December 15, 2006

This Lambda the Ultimate post pointed to an interview with Bjarne Stroustrup, the creator of the C++ programming language, in which he says he was influenced by the 19th century philosopher Søren Kierkegaard. It immediately reminded me of a Kierkegaard quote to which I find myself drawn over and over:

What I need to make up my mind about is what I must do, not what I must know, except insofar as knowledge must precede every action…The vital thing is to find a truth which is truth for me, to find the idea for which I can live and die. Of what use would it be for me to discover a so-called objective truth…if it had no deeper significance for me and my life? (Søren Kierkegaard)

I am still very much in search of this “idea”. I first saw this quote on Julia Watkin’s University of Tasmania website. During the brief time that I knew her, I enjoyed talking with Julia about philosophy and other subjects. Sadly, Julia is no longer with us. I wonder what she would have had to say about Stroustrup’s interview comments re: Kierkegaard?

I went back to Stroustrup’s book, The Design and Evolution of C++ (Addison-Wesley, 1994) to see what he had originally said about Kierkegaard. Here are the relevant excerpts (page 23):

I have a lot of sympathy for the student Euclid reputedly had evicted for asking, “But what is mathematics for?” Similarly, my interest in computers and programming languages is fundamentally pragmatic.

I feel most at home with the empiricists rather than with the idealists…That is, I tend to prefer Aristotle to Plato, Hume to Descartes, and shake my head sadly over Pascal. I find comprehensive “systems” like those of Plato and Kant fascinating, yet fundamentally unsatisfying in that they appear to me dangerously remote from everyday experiences and the essential peculiarities of individuals.

I find Kierkegaard’s almost fanatical concern for the individual and keen psychological insights much more appealing than the grandiose schemes and concern for humanity in the abstract of Hegel or Marx. Respect for groups that doesn’t include respect for individuals of those groups isn’t respect at all. Many C++ design decisions have their roots in my dislike for forcing people to do things in some particular way. In history, some of the worst disasters have been caused by idealists trying to force people into “doing what is good for them.” Such idealism not only leads to suffering among its innocent victims, but also to delusion and corruption of the idealists applying the force. I also find idealists prone to ignore experience and experiment that inconveniently clashes with dogma or theory. Where ideals clash and sometimes even when pundits seem to agree, I prefer to provide support that gives the programmer a choice.

In Julia Watkin’s book Kierkegaard (Geoffrey Chapman, 1997, pages 107-108), she had this to say:

In his use of the Socratic method, Kierkegaard strove to keep his own view to himself through the use of pseudonyms, acting as an “occasion” for people’s discovery and self-discovery instead of setting himself up as a teaching authority or arguing the rightness of his own ideas. I would urge that it is this feature of Kierkegaard’s writing that makes him especially effective in a time when two main tendencies seem to be especially dominant – a pluralism that accepts the validity of all views but stands by the correctness of no particular view of the universe, and a scientific or religious fundamentalism that is rigidly exclusive of views other than its own. Kierkegaard avoids the pitfalls of both trends, and he also does something else; he makes room for truth, both intellectual and existential, through encouraging people to be open-minded, to be aware of the spiritual dimension of existence, and to venture in life as well as in thought.

Although Stroustrup remarked in the interview referred to above that he is “…not particularly fond of Kierkegaard’s religious philosophy”, there is some resonance between his comments and Julia’s analysis.

SILLIAC – the first Australian computer built within an Australian university

October 30, 2006

It’s the 50th anniversary of Australia’s SILLIAC computer. Here’s a great website with videos and podcasts.

While you’re at it, check out the slightly older CSIRAC (1947). I visited CSIRAC at the Melbourne Museum in April this year and found it to be an inspiring and humbling experience.

Here’s a short article entitled 60 years of change that talks about the changes that have occurred from ENIAC onward.