The influential American computer engineer Charles P. Thacker died on June 12, aged 74.
Thacker designed the Alto personal computer at Xerox PARC in the 1970s, a machine that influenced the development of the Mac after Steve Jobs saw it during a visit to PARC.
He also contributed to the development of Ethernet, Tablet PCs, and laser printers.
The computer scientist Butler Lampson, one of Thacker’s colleagues at Xerox PARC and later at Microsoft, has spoken about his ability to see what was important and his breadth of coverage:
He could do everything from circuit design all the way through hardware architecture and programming and user-interface design.
The Association for Computing Machinery and the IEEE Computer Society recently honoured Thacker with the Eckert-Mauchly Award.
I enjoyed reminiscing about BASIC when it recently turned 50, on May 1, 2014. I learned more about the events surrounding the creation of Dartmouth BASIC from the Dartmouth web pages and especially from interview videos with co-inventors John Kemeny and Thomas Kurtz. Given my development of ACE BASIC for the Amiga in the mid-’90s, the BASIC programming language has a special place in my heart. More about ACE shortly.

My first experience with BASIC, and with programming in general, was in 1977, in my second year of high school (Norwood High). Our class marked up a deck of cards (in pencil) with a BASIC program and submitted it to Angle Park Computing Centre. A week or two later I remember receiving a printout of a partial run, showing an ASCII plot of some function (a deceleration curve, I think) tantalisingly cut short by an error, the details of which I don’t recall.
At the time I thought that was an interesting experience, but I set it aside. As I described here, in 1978 the school bought a PDP-11 and installed it in an air-conditioned room complete with a card reader, printer, and terminals. I remember seeing the machine for the first time, gawking in wonder through the glass window in the door to the room. For the first six months, most students were only allowed to create card decks rather than use a terminal. At least the turnaround time was better than for Angle Park: you could get your program run’s printout, correct errors in your card deck, and submit it again via the card reader’s hopper.
Apart from a small amount of class-time exposure to the machine, I became a “computer monitor”, assigned on a roster to be present while others used the computer and given a modicum of responsibility for looking after the machine (e.g. card reader or printer problems, booting), but I didn’t learn much more about the PDP-11 that way.
What really hooked me was eventually being allowed to use the terminals (pictured at right) and the interactive BASIC programming that entailed. There was plenty of competition for terminal time! One of the first interactive programs I wrote was a simple guess-the-number game in which the user was told whether a guess was less than or greater than the number the machine was “thinking” of; a minimal sketch follows below. It seems trivial now, but that experience of interacting with an “artificial intelligence” (as it seemed to me at the time) was intoxicating, and it has stayed with me. Some fellow students started playing around with machine language on the PDP-11; that was a little beyond me at the time, but an understanding of that level would become important for me later.
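From memory, the game amounted to something like the following. This is a minimal sketch in a generic line-numbered BASIC of the era, not the exact program; details such as INPUT prompts and the RND function varied from dialect to dialect.

10 REM GUESS-THE-NUMBER: THE MACHINE "THINKS" OF A NUMBER FROM 1 TO 100
20 LET N = INT(RND(1) * 100) + 1
30 PRINT "I AM THINKING OF A NUMBER BETWEEN 1 AND 100"
40 INPUT "YOUR GUESS"; G
50 IF G < N THEN PRINT "TOO LOW" : GOTO 40
60 IF G > N THEN PRINT "TOO HIGH" : GOTO 40
70 PRINT "YOU GOT IT!"
80 END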
In the late ’70s, Tandy had a computer store in Gawler Place, Adelaide. I used to catch a bus into town on Friday nights, pull up a chair at a TRS-80 Model 1 on display and sit there for an hour or two typing in BASIC source code for games from a book; the sales people didn’t seem to mind too much. 🙂
When I’d finished year 12 of high school, started working as a nurse in 1981, and was earning money, I bought a CASIO FX-702P: essentially a calculator, programmable in BASIC, with interfaces for a cassette recorder (for programs and data) and a printer. Within a year or so, I had a Sinclair ZX-81 connected to my parents’ old HMV black and white TV in the country (where I visited most weekends): a big screen indeed! This odd little machine fired my imagination via its space-age programming manual cover. Adding the 16K RAM expansion pack (shown below at rear) allowed much larger programs to be written compared to the unexpanded 1K machine. Programming in BASIC while listening to music like Kraftwerk’s Computer World, with simplistic, German-accented lyrics like these:
I program my home computer. Beam myself into the future.
it somehow seemed that the future was coming fast and that it was going to be overwhelmingly positive. This was a time of innocent joy when nothing was standardised (hardware or operating systems), the term micro-computer was more likely to be used than personal computer, the sterile DOS-based IBM PC “business computer” was barely beginning to emerge, and the Macintosh was not yet in sight.
The pages of magazines like Australian Personal Computer and Compute! were filled with BASIC program listings for specific machines just crying out to be adapted to other BASIC dialects. Reading books such as Christopher Evans’ The Mighty Micro (1979) filled me with optimism for the future. Reading Isaac Asimov’s I, Robot and the like fired my imagination, as did TV shows like Dr Who and Blake’s 7. To be honest, all of this was also somewhat of a welcome escape from the daily realities of being a young nurse.
My next machine was a Commodore PET (CBM 4016). Built like a Sherman tank, the PET gained a 5.25″ floppy disk drive (which had cooling problems!) and a dot matrix printer via its IEEE interface. I happily spent many weekends creating games in BASIC on this computer. I also wrote a version of Eliza-the-psychotherapist that kindled an interest in artificial intelligence and language processing. Occasionally dropping into the PET’s machine language monitor got me thinking more about low-level concepts (processor registers etc.), and reading Programming the 6502 by Rodnay Zaks (1981) helped further my understanding.

That PET was followed by a VIC-20 and a C-64 into the mid-’80s, both of which I (of course) programmed in BASIC, plus a bit of hand-assembled 6502/6510 machine code POKEd into obscure areas of memory (such as the cassette buffer, not in use when a 5.25″ floppy disk drive was the secondary storage device). I started to gain some exposure to databases (SuperBase 64), word processors, and other programming languages like Pascal. I also enjoyed using BASIC on these machines to interface with relay boards and sensors, by POKEing values into and PEEKing values from I/O memory locations, in the style sketched below.

In 1987, after a couple of years going off in various directions, I moved from Adelaide to Tasmania to work as a nurse (in ICU, Recovery etc.) where I met my future wife, Karen. I didn’t have any computer with me there because I initially thought I’d only stay for a year or so, but I ended up staying for a decade. My first computer purchase in Tasmania was an Acorn Electron, little brother to the BBC Micro and programmable in a BBC dialect of BASIC. I also learned a bit of LISP (from a cassette-loaded interpreter) using the Electron.
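On the C-64, that PEEK/POKE style of I/O looked roughly like the following. This is a hypothetical sketch, assuming a relay driver wired to user-port bit 0 and a sensor on bit 1; addresses 56577 and 56579 are the CIA#2 port B data and data-direction registers that sit behind the user port.

10 REM C-64 USER PORT: BIT 0 AS OUTPUT (RELAY), BIT 1 AS INPUT (SENSOR)
20 POKE 56579,1 : REM DATA DIRECTION REGISTER: BIT 0 OUT, OTHERS IN
30 POKE 56577,1 : REM SET BIT 0 HIGH TO ENERGISE THE RELAY
40 IF (PEEK(56577) AND 2)=2 THEN PRINT "SENSOR TRIGGERED"
50 POKE 56577,0 : REM RELAY OFF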
By far the most important computer purchase ever for me was a Commodore Amiga 500. I learned so much from that machine, initially programming it in AmigaBASIC and smatterings of machine code, then in C and a handful of other languages. The Amiga’s pre-emptive multi-tasking operating system and state-of-the-art graphics and sound capabilities were fantastic. It was also to this machine that I connected my first hard disk drive. I wrote simple astronomy programs, a simple drawing program for Karen, and created an alarm and security system with infra-red sensors, keypad, strobe light etc. It even woke me up (or more likely Karen, so she could come to my aid) if I went sleepwalking. 🙂 I also used the Amiga and C-64 for the pre-Internet Viatel (Australia’s videotex system), bulletin boards, and CompuServe.
I took a statistics course at UniTas (Launceston) for fun in 1987 and a year or so later started an Applied Computing degree there, taking a double major in computer science and philosophy. This ultimately led me away from a career in nursing and onto a software engineering career (after stints as a computer systems officer and a junior academic post-graduation). One of the subjects I took as an undergraduate was “Advanced Programming”, in which we wrote a compiler for a subset of Pascal into p-codes (similar to UCSD p-codes and not unlike Java VM bytecodes) rather than assembly or machine code for the native machine (Intel). One outcome was that I became increasingly interested in programming language translation and programming paradigms (especially object-oriented, functional, logic, and concurrent). Another was that I resolved to take that knowledge and write a compiler for the Amiga for a language that I myself would want to use, not just as an academic exercise.
In October 1991, I started development of ACE BASIC for the Commodore Amiga. It was released to testers in March 1992 and made available for public consumption in February 1993. Like the original Dartmouth BASIC, ACE was compiled, unlike the many BASIC implementations that have been interpreters; specifically, ACE was a compiler for the language of the interpreted Microsoft AmigaBASIC that shipped with the Amiga.
This article, written for the online Amiga Addicts journal, gives some idea of the history and motivations for ACE, and here is an interview I gave in 1997 about ACE. Although the instruction set of the Amiga’s 68000 processor was not quite as orthogonal as the PDP-11’s, it was still really nice. ACE compiled BASIC source into peephole-optimised 68000 assembly code.
This was assembled to machine code by Charlie Gibbs’ A68K assembler and linked against library code with the Software Distillery’s Blink linker (later I also used PhxAss and PhxLnk). I wrote 75% of ACE’s runtime libraries in 68000 assembly, only later waking up to the idea that C would have been a more productive choice. One upside is that I became quite comfortable working with assembly language, a comfort I’ve made use of in recent years when working with hardware simulator testing (ARM, PowerPC etc.) and micro-controller compilers.
A wonderful community of enthusiastic users built up around ACE. I wrote an integrated development environment, and a community member wrote one too (Herbert Breuer’s ACE IDE is shown below).
Another member wrote a “super-optimiser” that rewrote parts of ACE’s generated assembly code to be even faster than I managed with my simple optimisations.
ACE was guilty of a criticism that the Dartmouth BASIC co-inventors (John Kemeny and Tom Kurtz) levelled at many BASICs since their first: being machine-specific. But then that was the intent of ACE: to make programming the Amiga more approachable to more people, combining the simple abstractions of BASIC with the unique features of the Amiga and the run-time efficiency of a compiled language like C.
With the Amiga’s demise, around 1996 I moved on to other platforms. I wrote a LISP interpreter for the Newton PDA (also doomed; I can pick ’em!) between 1998 and 2000. That was fun and had a nice small community associated with it, but it didn’t match ACE and its community.
I eventually came to possess PCs, programming them with a smattering of GW-BASIC, quite a lot of Turbo Pascal, some Microsoft QuickC, a little assembly, and Visual Basic.
When Java appeared in 1996, I greeted it with enthusiasm and have been an advocate of it and the Java Virtual Machine, as a professional and spare-time software developer, on and off ever since. These days I’m more likely to code in Java, C/C++, Python (where once I would have used Perl), or perhaps R rather than a BASIC dialect, none of which denigrates BASIC.
The fact is that BASIC made early microcomputers accessible, such that many of us interacted with them more directly than is possible with modern computers (PCs and Macs), despite all their advantages and power. Arguably, we expected less from the machines yet engaged in highly creative relationships with them. Anyone who has spent much time programming will recognise the allure, and the interactive nature of these early BASIC machines only added to it.
I agree with the famous Dutch computer scientist Edsger Dijkstra when he said:
Computing Science is no more about computers than astronomy is about telescopes.
I also sympathise with his declaration that the BASIC statement GOTO could be considered harmful, due to the “spaghetti code” it leads to; the contrived sketch below gives the flavour.
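Any old line-numbered BASIC will do to show the kind of tangle he had in mind; this fragment merely prints a heading and counts from 1 to 10, yet the control flow ricochets around the listing:

10 LET I = 1
20 GOTO 60
30 PRINT I
40 LET I = I + 1
50 IF I <= 10 THEN GOTO 30
55 GOTO 80
60 PRINT "STARTING"
70 GOTO 30
80 END

But I don’t really agree with his assessment that: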
It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
I heard the same propaganda from a university lecturer. Apparently some of us were able to be “rehabilitated”. Then again, along with his comments about BASIC, Dijkstra made some unkind comments about other programming languages, including COBOL, Fortran, and APL, for example:
The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.
With apologies to Grace Murray Hopper, I have more sympathy with this last one. 🙂
The truth is that all programming languages are crude approximations of the Platonic ideal of bringing together two minds: one artificial, one natural. There are no programming languages about which I can say: there are no improvements possible here, and there are very few languages that make communion with the machine a beautiful experience. All are to be reviled and admired for different qualities. But BASIC, in all its dialects, with all its flaws, and with the benefit of 20/20 hindsight and large gobs of nostalgia, was mostly to be admired and was directly responsible for many of us falling in love with the idea and activity of programming.
If one character, one pause, of the incantation is not strictly in proper form, the magic doesn’t work. Human beings are not accustomed to being perfect, and few areas of human activity demand it. Adjustment to the requirement for perfection is, I think, the most difficult part of learning to program. (Frederick Brooks)
You think you know when you learn, are more sure when you can write, even more when you can teach, but certain when you can program. (Alan Perlis)
Increasingly, people seem to misinterpret complexity as sophistication, which is baffling – the incomprehensible should cause suspicion rather than admiration. (Niklaus Wirth)
APL is a write-only language. I can write programs in APL, but I can’t read any of them. (Roy Keir)
What is the sound of Perl? Is it not the sound of a wall that people have stopped banging their heads against? (Larry Wall)
Beware of bugs in the above code; I have only proved it correct, not tried it. (Donald Knuth)
Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else. (Eagleson’s Law. Eagleson is an optimist; the real number is more like three weeks.)
Niklaus Wirth has lamented that, whereas Europeans pronounce his name correctly (Ni-klows Virt), Americans invariably mangle it into (Nick-les Worth). Which is to say that Europeans call him by name, but Americans call him by value.
Real programmers don’t comment their code. It was hard to write, it should be hard to understand.
A LISP programmer knows the value of everything, but the cost of nothing.
Old programmers never die. They just branch to a new address.
Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it. (Kernighan)
I saw `cout’ being shifted “Hello world” times to the left and stopped right there. (Steve Gonedes)
Programming is understanding. (Kristen Nygaard)
Computing Science is no more about computers than astronomy is about telescopes. (Edsger Dijkstra)
I recently finished reading Engines of Logic (2000) by Martin Davis (apparently published as The Universal Computer in some countries), of Davis-Putnam SAT-solver algorithm fame. It is a book about the origins of computer science from the viewpoint of the mathematicians who founded it, in particular Leibniz, Boole, Frege, Cantor, Hilbert, Gödel, and Turing.
Leibniz had the notion that it ought to be possible to write down ideas in a language (he called this a universal characteristic) such that “serious men of good will” could sit together and solve a problem by calculation, using an algebra of logic he referred to as the calculus ratiocinator.
Despite attempts at such a language and algebra of logic by Leibniz, it was ultimately the work of his successors that gave rise to the logic that made automated computation possible.
Of Leibniz’s work, Davis said: “What Leibniz has left us is his dream, but even this dream can fill us with admiration for the power of human speculative thought and serve as a yardstick for judging later developments.”
In the epilogue, Davis had this to say:
The Dukes of Hanover thought they knew what Leibniz should be doing with his time: working on their family history. Too often today, those who provide scientists with the resources necessary for their lives and work try to steer them in directions deemed most likely to provide quick results. This is not only likely to be futile in the short run, but more importantly, by discouraging investigations with no obvious immediate payoff, it shortchanges the future.
These days, universities and, it seems, too many aspects of society are becoming shackled to the oft-times short-sighted and petty expectations of business, as if business mattered as an end in itself. We would do well to pay attention to history.
On the subject of history, it occurs to me increasingly that most of what we study is in fact historical in nature. Incremental advances in computer science, software engineering, astronomy, and Science in general are mere blips on the vast landscape of accumulated knowledge. When I read books such as Engines of Logic and The Art of Electronics, I am overwhelmed by the contributions of countless scientists and engineers over decades, to say nothing of the work of the founders of Science such as Copernicus, Galileo, Kepler, Newton, and Einstein.
This Lambda the Ultimate post pointed to an interview with Bjarne Stroustrup, creator of the C++ programming language, in which he says he was influenced by the 19th-century philosopher Soren Kierkegaard. It immediately reminded me of a Kierkegaard quote to which I find myself drawn over and over:
What I need to make up my mind about is what I must do, not what I must know, except insofar as knowledge must precede every action…The vital thing is to find a truth which is truth for me, to find the idea for which I can live and die. Of what use would it be for me to discover a so-called objective truth…if it had no deeper significance for me and my life? (Soren Kierkegaard)
I am still very much in search of this “idea”. I first saw this quote on Julia Watkin’s University of Tasmania website. During the brief time that I knew her, I enjoyed talking with Julia about philosophy and other subjects. Sadly, Julia is no longer with us. I wonder what she would have had to say about Stroustrup’s interview comments re: Kierkegaard?
I went back to Stroustrup’s book, The Design and Evolution of C++ (Addison-Wesley, 1994), to see what he had originally said about Kierkegaard. Here are the relevant excerpts (page 23):
I have a lot of sympathy for the student Euclid reputedly had evicted for asking, “But what is mathematics for?” Similarly, my interest in computers and programming languages is fundamentally pragmatic.
I feel most at home with the empiricists rather than with the idealists…That is, I tend to prefer Aristotle to Plato, Hume to Descartes, and shake my head sadly over Pascal. I find comprehensive “systems” like those of Plato and Kant fascinating, yet fundamentally unsatisfying in that they appear to me dangerously remote from everyday experiences and the essential peculiarities of individuals.
I find Kierkegaard’s almost fanatical concern for the individual and keen psychological insights much more appealing than the grandiose schemes and concern for humanity in the abstract of Hegel or Marx. Respect for groups that doesn’t include respect for individuals of those groups isn’t respect at all. Many C++ design decisions have their roots in my dislike for forcing people to do things in some particular way. In history, some of the worst disasters have been caused by idealists trying to force people into “doing what is good for them.” Such idealism not only leads to suffering among its innocent victims, but also to delusion and corruption of the idealists applying the force. I also find idealists prone to ignore experience and experiment that inconveniently clashes with dogma or theory. Where ideals clash and sometimes even when pundits seem to agree, I prefer to provide support that gives the programmer a choice.
In Julia Watkin’s book Kierkegaard (Geoffrey Chapman, 1997, pages 107-108), she had this to say:
In his use of the Socratic method, Kierkegaard strove to keep his own view to himself through the use of pseudonyms, acting as an “occasion” for people’s discovery and self-discovery instead of setting himself up as a teaching authority or arguing the rightness of his own ideas. I would urge that it is this feature of Kierkegaard’s writing that makes him especially effective in a time when two main tendencies seem to be especially dominant – a pluralism that accepts the validity of all views but stands by the correctness of no particular view of the universe, and a scientific or religious fundamentalism that is rigidly exclusive of views other than its own. Kierkegaard avoids the pitfalls of both trends, and he also does something else; he makes room for truth, both intellectual and existential, through encouraging people to be open-minded, to be aware of the spiritual dimension of existence, and to venture in life as well as in thought.
Although Stroustrup remarked in the interview referred to above that he is “…not particularly fond of Kierkegaard’s religious philosophy”, there is some resonance between his comments and Julia’s analysis.