Archive for the ‘Programming’ Category

Ace meets ACE BASIC

May 6, 2023

Nic told me about a small Dr Who fan event in Adelaide on May 6th, so we attended it. It was too difficult for Karen to be there, since she was sleeping between night shifts.

Sophie Aldred, who played Ace, the last companion of the last Doctor (played by Sylvester McCoy) in the old series, was the guest.

She reminisced about her time in the TARDIS, allowed photos to be taken with her, signed stuff for people. Down to earth and entertaining, her voice seems unchanged since those BBC TV days. Remembrance of the Daleks (an eminently watchable classic) and Survival (a classic in its own way, especially with respect to understanding more about Ace) were screened while rows of people single-filed out for photo-ops and autographs at various times during the day.

In the Q&A session, Sophie commented upon how she did as many of her own stunts as the producers would let her. She also related how John Nathan-Turner suggested that various costumes be taken home at the end of episodes when things were coming to an end in 1989, not being able to say directly what was coming. After having her “Ace jacket” at home for 30+ years, Sophie was able to bring it to the set when reprising her role in Power of the Doctor in 2022. The one in the photo below was made by a local fan’s Mum for the Adelaide event.

I was born 3 days after Dr Who first aired on 23rd November 1963, so grew up with all the Doctors. My vivid memories of Dr Who start with Patrick Troughton’s and Jon Pertwee’s Doctors. Before McCoy, my favourite Doctor was, and to a large extent always will be, Tom Baker’s. Visiting William Hartnell’s episodes again when I was older was necessary to really appreciate him. As for companions, there are too many good ones to choose between. Apart from Sarah Jane and Ace, of course…

Karen and I started going out together in 1988, in the middle of the too-short tenure of McCoy and Aldred, so they were “our” Dr Who and companion.

In the early 90s I created the ACE Basic compiler for the Amiga. The ACE Programmer’s Reference Manual starts with:

What is ACE?

  • AmigaBASIC Compiler with Extras?
  • A Creative Environment?
  • A Compiler for Everyone?
  • A Cool Enterprise?
  • Automatic Computing Engine (à la Alan Turing)?
  • Dr Who’s last companion?

I related all this to Sophie at autograph time, along with the fact that Nic and I being there represented two generations of Dr Who fans. She seemed to enjoy hearing it all.

Contrary to a lot of my posts these days, there’s nothing particularly deep to say here, except that sometimes it’s just a Good Thing to look after yourself a little bit, have some fun, reminisce about simpler times when anything seemed possible, even spend a little money on “frivolous” things now and then.

A quarter century of Java

July 4, 2020

We are all natives now, and everybody else not immediately one of us is an exotic. What looked once to be a matter of finding out whether savages could distinguish fact from fancy now looks to be a matter of finding out how others, across the sea or down the corridor, organise their significative world. (Clifford Geertz)

The Java Programming Language turned 25 years old on May 23 2020.

I’ve been writing Java code for 24 years, since 1996.

When Java arrived on the scene, my languages of choice were C, C++ and Perl, but mostly the first and last.

Before that, for me, there were various BASIC dialects, with smatterings of assembly code. Then at university, just when I thought I knew something about programming, I learned Pascal, C and Fortran.

Later, languages such as LISP, Prolog, ML, CLIPS, SQL, even COBOL (about which I really never could find anything to like) showed me different ways in which a computer could be programmed.

Programming paradigms and language translation (compilers, interpreters) have interested me since around 1989. I wrote the ACE BASIC compiler in C from 1991 to 1996.

I’ve since designed and implemented other programming languages, including LittleLisp for the Newton PDA, and others that were only ever experiments; most were personal projects, one was commercial. In recent times I’ve been developing a domain specific functional language called VeLa that I’ll write about some time.

From the start, Java made it possible to write cross-platform programs, with or without GUIs, with relative ease.

Its byte-code compilation model resonated with me after having written my first compiler for a subset of Pascal in 1989 that generated UCSD p-codes.

In 2004, nine years after Java’s first release, Java 5 brought generics, while regular expressions and assertions had already arrived in J2SE 1.4. I had craved these features, especially generics and regular expressions, for years, since writing C++ and Perl code in the early to mid-90s.

Java’s history is something of a long strange trip: from set-top boxes to web applets, the desktop, enterprise frameworks, smart cards and other embedded devices, to Just In Time compilation and dynamic profiling, WebStart and security model criticisms, and from Sun Microsystems to Oracle.

C# learned from Java’s mistakes. Some of its proponents have not always seemed to me to be honest about acknowledging its heritage though.

Programming Languages are tools for thought, for human communication, and not merely a means by which to bend a machine to our will. As I’ve mentioned elsewhere, they can also determine what it is possible for us to think.

My working days are now dominated by R, Python, C++ and sometimes Fortran; and bash of course. R and Python are powerful and have a rich ecosystem of tools and libraries, but I’m no fan of dynamic typing which makes it easy to get started, yet leaves too much unsaid, too many possibilities for error. This problem only becomes worse as programs grow.

Rust, Julia, Go, Clojure, Swift are all contenders to the throne now too. Which of these will stand the test of time? Which will one day be considered to have legacy code bases, just like Java, C, C++, Fortran, COBOL? All of them, I would say.

Then there are those languages like Haskell that blaze trails but tend not to be dominant. Java and most of the current crop owe such languages a debt of gratitude as functional programming’s higher order functions, type inference, and immutability have come out of academia to take over the world.

Distinguishing fact from fancy, fashion from solid ground, making predictions about the future of the programming language landscape is no easy task.

The older I get, the more selective I become about what new languages and frameworks to learn and the more interested I am in as much static typing, type inference, and run-time efficiency as I can get my hands on.

Yet, after a quarter century, I still like to code in Java whenever I can. It has its own beauty. I don’t pay much attention to its detractors. Of course, that’s where it’s important to think about context. If I’m writing code for a tiny embedded system or want to squeeze the most out of a machine, I’m more likely to code in C or C++ than anything else. But if I want to build a big habitable castle in the sky, I’ll still choose Java when I can.

ACE, optimisation and an old friend

December 29, 2017

In late November I received an email from an old friend: Sean Miller. Sean was a member of the wonderful community that built up around ACE BASIC, a compiler for the Amiga I developed as a labour of love between 1991 and 1996. I’ve written about ACE in this blog before. Sean told me how his use of ACE influenced him over the years. It has been great to get back in contact with him.

I felt honoured and humbled when, on Christmas Eve, Sean released an episode about ACE on the Raising Awesome YouTube channel he and his son have created. In this episode (Retro Amiga Computing – ACE BASIC and Questions2.1 Development):

Sean shows how to use ACE on an Amiga emulator to compile and run a program he wrote more than 20 years ago (Questions).

Retro Computing with ACE

I’ve expressed this in email to Sean, but let me say it publicly: thank you Sean! It means more to me than I can say.

During the video, Sean comments on the progress of the compilation of Questions, notes that there were around 4000 peephole optimisations (see screenshot from video above) and wonders whether I might explain what a peephole optimisation is. I’d be happy to of course. Now you’ve got me started! 🙂

ACE generates assembly code for the beautiful Motorola 68000 microprocessor. Compilation of some ACE language constructs generates sub-optimal assembly code instructions on the first pass. Assembly code is emitted as ACE parses the input source code without any knowledge of the broader context of the program.

Here’s a trivial ACE program:

x%=42
y%=x%*3

This simply stores 42 in the short integer variable x% (the type is denoted by %), multiplies x% by 3 and stores the product in the variable y%. I chose integer over floating point for this example because the assembly generated for floating point is more complex and would distract from the explanation. Speaking of distractions…

As an aside, unlike modern Intel, ARM and other processors, the 68000 didn’t have a floating point unit (FPU), so floating point operations were carried out by library code instead of hardware, such as a Motorola Fast Floating Point or IEEE 754 library. As an aside to my aside, the Amiga 500 had a 68000 processor whereas the Amiga 1200 (I owned both eventually) had a 68020. The 68020 could offload floating point instructions (which it did not know how to handle) to a co-processor. The 68040 was the first 68k processor with an on-board FPU. This is a whole topic by itself.

Back to the trivial example ACE program above…

Here’s the 68000 assembly ACE generates for the two line program without any optimisation (i.e. without the -O option):

    move.w  #42,-(sp)
    move.w  (sp)+,-2(a4)
    move.w  -2(a4),-(sp)
    move.w  #3,-(sp)
    move.w  (sp)+,d0
    move.w  (sp)+,d1
    muls    d1,d0
    move.l  d0,-(sp)
    move.l  (sp)+,d0
    move.w  d0,-(sp)
    move.w  (sp)+,-4(a4)

With optimisation we have 6 assembly instructions instead of 11:

    move.w  #42,-2(a4)
    move.w  -2(a4),-(sp)
    move.w  #3,d0
    move.w  (sp)+,d1
    muls    d1,d0
    move.w  d0,-4(a4)

Looking at the first two lines of the 11-instruction unoptimised sequence:

    move.w  #42,-(sp)
    move.w  (sp)+,-2(a4)

Example stack operations (source: goo.gl/5EuhjG)

ACE examines this pair in a sliding window, or so-called peephole, onto the emitted instructions and notices that 42 is being pushed onto the last-in, first-out stack, then immediately popped from the stack and stored at the variable x%’s address, represented by an offset of -2 from the address held in register a4. The peephole optimiser reduces this push-pop pair to a single instruction:

    move.w  #42,-2(a4)

ACE stops short of taking the newly optimised pair:

    move.w  #42,-2(a4)
    move.w  -2(a4),-(sp)

then peephole optimising it and emitting this:

    move.w  #42,-(sp)

The reason is that the programmer has asked for 42 to be stored in the variable x%.

A more ideal sequence would have been this:

move.w  #42,-2(a4)
move.w  -2(a4),d0
muls    #3,d0
move.w  d0,-4(a4)

which literally says:

  • move 42 into variable x%’s memory location
  • move the value stored at x%’s memory location into the 68k register d0
  • carry out a signed multiplication of 3 with the contents of register d0, storing the result in d0
  • move the contents of register d0 into variable y%’s memory location

If the constraints relating to use of x% and y% did not exist, the following would be sufficient to yield the product of 42 and 3 in 68k assembly:

move.w  #42,d0
muls    #3,d0

Notice that the 4 instructions after the multiplication (muls) in the unoptimised sequence are reduced, over more than one pass of the peephole optimiser, to a single instruction that stores the product into y%, from this:

    move.l  d0,-(sp)
    move.l  (sp)+,d0
    move.w  d0,-(sp)
    move.w  (sp)+,-4(a4)

to this:

    move.w  d0,-4(a4)

So, ACE does better with this sequence than it does with the instructions before the multiplication.

There are other simple optimisations carried out when the -O option is used, relating to numeric negation, but this example illustrates the key aspects.

Bernd Brandes later wrote a more powerful optimiser for ACE, the SuperOptimiser, that built upon this simple peephole optimisation approach.

Every instruction the processor doesn’t have to execute means fewer CPU cycles, so a run-time speed up. This matters a lot for example, when such instructions are part of a loop that iterates many times.
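
To make the push/pop reduction concrete, here is a minimal sketch of a two-instruction peephole pass. It’s written in JavaScript for readability rather than the C of the original compiler, the function name peepholePushPop is mine, and it handles only the single push-then-pop pattern described above; ACE applies reductions like this repeatedly, over more than one pass, until no more patterns match.

    // Collapse "move.w #value,-(sp)" immediately followed by
    // "move.w (sp)+,dest" into a single "move.w #value,dest".
    // A toy two-instruction peephole window, not ACE's actual optimiser.
    function peepholePushPop(instructions) {
        const out = [];
        let i = 0;
        while (i < instructions.length) {
            const curr = instructions[i];
            const next = instructions[i + 1];
            const push = curr && curr.match(/^move\.w\s+(#-?\d+),-\(sp\)$/);
            const pop  = next && next.match(/^move\.w\s+\(sp\)\+,(.+)$/);
            if (push && pop) {
                // Replace the pair with one instruction and skip both.
                out.push(`move.w  ${push[1]},${pop[1]}`);
                i += 2;
            } else {
                out.push(curr);
                i += 1;
            }
        }
        return out;
    }

    // The first two unoptimised instructions from the example above:
    console.log(peepholePushPop(["move.w #42,-(sp)", "move.w (sp)+,-2(a4)"]));
    // => [ "move.w  #42,-2(a4)" ]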

To revisit ACE’s code generation and optimisation implementation, I downloaded Vidar Hokstad’s improvements to the ACE source (on GitHub) for compilation under Linux. I compiled that on my Mac OS X laptop and used it to generate 68k assembly code. Vidar contacted me several years ago to say that he was engaging in “software archaeology” (that made me feel a bit old, even then) with the ACE source code. I appreciate Vidar’s efforts. He sorted out some compilation problems under the GNU C compiler (gcc) that I would otherwise have had to fix myself.

It’s interesting to look at the Intel assembly generated by gcc for a similar C code fragment. The following would have to be embedded in a function:

int x,y;
x=42;
y=x*3;

The gcc compiler generates this sequence:

    movl    $0, -4(%rbp)
    movl    $42, -8(%rbp)
    imull   $3, -8(%rbp), %ecx
    movl    %ecx, -12(%rbp)

As with the ACE generated 68k assembly, only the relevant part is shown. There’s additional code generated just to start up and shut down a program (by gcc, ACE or any other compiler). The Intel assembly generated here is a bit better than the optimised 68k code ACE generated (4 vs 6 lines) although surprisingly, not very much better.

When I wrote ACE in the 90s, all components were written either in C or 68000 assembly and I went straight from an implicit parse tree to assembly code generation. These days I tend to use ANTLR or similar tools for lexical analysis (converting character streams to tokens) and parsing (checking the token sequence against the language’s grammar). I have yet to use The LLVM Compiler Infrastructure for language development, but that’s on my list too.
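
To illustrate the lexical analysis step, here is a tiny hand-rolled lexer for assignments like those in the earlier two-line example. It’s only a sketch: ANTLR would generate something far more capable from a grammar, and the token names and regular expressions here are made up for the illustration.

    // Convert a character stream like "y%=x%*3" into a token stream.
    // Token kinds (IDENT, NUMBER, ...) are illustrative only.
    function tokenise(source) {
        const rules = [
            [/^\s+/, null],                        // skip whitespace
            [/^[A-Za-z][A-Za-z0-9]*%?/, "IDENT"],  // identifiers such as x%, y%
            [/^\d+/, "NUMBER"],
            [/^=/, "EQUALS"],
            [/^\*/, "STAR"]
        ];
        const tokens = [];
        let rest = source;
        while (rest.length > 0) {
            let matched = false;
            for (const [regex, kind] of rules) {
                const m = rest.match(regex);
                if (m) {
                    if (kind) tokens.push({ kind, text: m[0] });
                    rest = rest.slice(m[0].length);
                    matched = true;
                    break;
                }
            }
            if (!matched) throw new Error("Unexpected input: " + rest);
        }
        return tokens;
    }

    console.log(tokenise("y%=x%*3"));
    // => IDENT "y%", EQUALS "=", IDENT "x%", STAR "*", NUMBER "3"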

Creating an intermediate representation (such as an abstract syntax tree) before code generation provides additional opportunities for optimisation, something I’ve been exploring in recent times. I’ll write more about that in another post.

To be honest, the more I think and write about this topic again, the more I want to.

Thanks again Sean.

Alto creator Charles P. Thacker dies

June 23, 2017

The influential American engineer Charles P. Thacker died on June 12, aged 74.

Thacker designed the Alto personal computer at Xerox PARC in the 1970s, which influenced the development of the Mac after Steve Jobs saw it during a visit to PARC.

He also contributed to the development of Ethernet, Tablet PCs, and laser printers.

The computer scientist Butler Lampson, one of Thacker’s colleagues at Xerox PARC and later at Microsoft, has spoken about his ability to see what was important and his breadth of coverage:

He could do everything from circuit design all the way through hardware architecture and programming and user-interface design.

The Association for Computing Machinery and IEEE Computer Society recently honoured Charles P. Thacker with the Eckert-Mauchly Award.

On porting an ACE program to HTML5 (among other things)

March 1, 2017

In recent times I’ve been thinking about ACE BASIC, a compiler for the Amiga I stopped working on just over 20 years ago; nostalgia’s setting in I guess. A few years ago I wrote a bit about ACE in relation to BASIC’s 50th; there’s more in these 1994 and 1997 articles.

As mentioned in the 50th post, a wonderfully thriving community built up around ACE between 1991 and 1996 centred upon an active mailing list and contributions of ideas, example programs and tools. I have been mildly (but pleasantly) surprised by a couple of things since I stopped development:

  1. Continued use of ACE on Amiga hardware and emulators for at least a decade afterward.
  2. A project to modify the code generator for post mid-90s Amiga operating systems and additional targets such as PowerPC and Intel.

Among other things, I’ve been thinking about a re-write for a modern platform or target, e.g. HTML5. The world of the 90s was still very platform-centric, but in the same year I stopped developing ACE, version 1.0 of the Java Development Kit was released, putting the power of Java and its  virtual machine into the hands of eager programmers. Java and JavaScript helped to consolidate the browser as an important platform and to define the shape of web development in a post-CGI (Common Gateway Interface, not Computer Generated Imagery) world.

A new compiler or interpreter is a non-trivial task, especially in my current spare-time-poor state, but I wanted to explore how an ACE program could be rewritten for an HTML5 context.

One of my favourite ACE programs was an implementation of IFS (Iterated Function Systems) to generate simple 2D structures such as ferns, trees, the Sierpinski Triangle and so on. So I started with this. It’s simple yet complex enough to allow for a comparison of approaches.

Here are a few observations on the original IFS ACE source code (ifs.b) and my initial HTML5 port of the code.

  • JavaScript in the browser appears to be faster than ACE on the Amiga. Sure, processors and clock speeds have improved since the mid-90s, but ACE generated native 68000 assembly code. Then again, thinking of JavaScript as an interpreted language is simplistic now that just-in-time compilation is in widespread use.
  • The ACE code is quite data-centric. DATA statements containing comma-separated values are read into two dimensional arrays, so the original data is not close to where it’s used and it’s not clear what the numbers are associated with. I could have taken this approach in the port, but chose instead to create a data object, a map, close to the point of use, copying the original array names (bad in some cases: a, b, c, okay in others: xoffset, yscale) from the ACE program for use as map key names, to make a correspondence easier to see.
    • This meant first transposing the data (in Excel) so that DATA columns became rows.
    • Preserving the existing DATA organisation could be accomplished by introducing functions such as data() and read() that create and read, respectively, a pool of data values. For future DATA-centric ACE program ports, I’ll try that approach (see the sketch after this list).
  • In ACE, the creation of a menu and its items is simple, as shown by the creation of the Project menu below; this menu name is an Amiga convention. Shortcut key letters are optional.
menu 1,0,1,"Project"
menu 1,1,1,"Sierpinski Triangle"
menu 1,2,1,"Square"
menu 1,3,1,"Fern"
menu 1,4,1,"Tree #1"
menu 1,5,1,"Tree #2"
menu 1,6,1,"Sunflower"
menu 1,7,0,"-------------------"
menu 1,8,1,"Help...","H"
menu 1,9,1,"About...","A"
  • Compare this with the odd combination of HTML, CSS and JavaScript in this initial attempt at a port.
  • On the other hand, ACE’s (and so AmigaBASIC’s) reliance upon numeric identifiers is almost as unreadable as a collection of DATA statements. The MENU statements above declare the Project menu to be the first (from the left of a menu bar), with each menu item numbered in order of desired appearance and 1 or 0 enabling or disabling the menu item. Subsequent enable/disable operations on menus must refer to the same combination of numeric IDs, e.g. menu 1,2,0 would disable the Square item. Also, menu item selection handling is a bit awkward in ACE.
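
Returning to the DATA point above, here is a minimal sketch of what such helpers might look like. The data() and read() names come from the idea described earlier; restore() and the sample values are my own additions, and none of this is part of ACEjs as it stands.

    // Minimal emulation of BASIC's DATA/READ (and RESTORE) statements.
    // data() appends values to a shared pool, read() consumes them in order,
    // restore() rewinds the pool pointer.
    const dataPool = [];
    let dataPointer = 0;

    function data(...values) {
        dataPool.push(...values);
    }

    function read() {
        if (dataPointer >= dataPool.length) {
            throw new Error("Out of DATA");
        }
        return dataPool[dataPointer++];
    }

    function restore() {
        dataPointer = 0;
    }

    // Roughly equivalent to: DATA 0.85, 0.04, -0.04, 0.85 ... READ a ... READ b
    data(0.85, 0.04, -0.04, 0.85);
    const a = read();
    const b = read();

This would let a port keep the shape of the original DATA statements while still running as plain JavaScript.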

The code eventually morphed into what I’ve dubbed ACEjs, in the spirit of some other JavaScript library/frameworks. I’m not claiming any novelty here. The idea was to answer the question: how might ACE code look in a modern context? I’m less concerned with slavishly preserving the look and feel of the program, i.e. I’m not trying to make it look like it’s running on an Amiga. I just want to make it functionally equivalent.

Here’s a screenshot of the simple example ifs.b program in ACEjs form:

IFS in ACEjs

I don’t currently have a screenshot of ifs.b running on an Amiga or an emulator.

In any case, the outcome so far is that I have made progress toward an ACE-inspired JavaScript library for HTML5. Here are some key aspects:

  • CSS, DOM and jQuery (so JavaScript) serve as the web’s assembly language, but only JavaScript code needs to be written.
  • Functions like menu(), window(), dialog() manipulate the DOM to add elements (canvas, list etc) via jQuery under the hood.
  • A menu declaration corresponding to the ACE code’s Project menu (with Help and separator items omitted) follows, a key difference being that menu items are paired with callback functions (e.g. see sierpinski below), another being that there is no support for shortcut keys currently:
acejs.menu("Project", [
    ["Sierpinski Triangle", sierpinski],
    ["Square", square],
    ["Fern", fern],
    ["Tree #1", tree1],
    ["Tree #2", tree2],
    ["Sunflower", sunflower],
    ["About...", about]
]);
  • A window declaration that adds a framed canvas looks like this:
    • wdw_id = acejs.window("IFS", 640, 400);
  • and operations on the window make use of an ID:
    • acejs.clear(wdw_id);
    • acejs.pset(wdw_id, outX, outY, pixelColor);
  • Multiple menus and windows can be created.
  • acejs.css is almost non-existent currently. I’m sure someone who delights in CSS could make it look suitably dark and brooding with their eyes closed. I make no claim to have any special talent in web design.

There’s arguably no need for a compiler or interpreter. JavaScript’s iteration, selection, and expressions are adequate. Having said that, ACEjs could form the basis of a target if I ever chose to write another ACE compiler or interpreter (with all that spare time of mine).

With ACEjs you only have to write an app.js source file for your application and use a standard index.html that brings in your code and whatever else is needed, in particular acejs.css (trivial right now) and acejs.js. The only special thing you have to do is to define an init() function in app.js to be invoked by the framework. The best way to see how this works is to look at the example.
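
As a rough illustration (not the actual example code), a minimal app.js might look something like the sketch below. It uses only the acejs.window(), acejs.menu(), acejs.clear() and acejs.pset() calls described above; the callback bodies, the colour argument and the plotting loop are placeholders of mine rather than ACEjs specifics.

    // A minimal, hypothetical ACEjs application (app.js).
    // init() is the entry point invoked by the framework.
    let wdw_id;

    function init() {
        wdw_id = acejs.window("IFS", 640, 400);

        acejs.menu("Project", [
            ["Sierpinski Triangle", sierpinski],
            ["About...", about]
        ]);
    }

    function sierpinski() {
        acejs.clear(wdw_id);
        // Plot some points; a real IFS would iterate an affine map here.
        for (let i = 0; i < 1000; i++) {
            acejs.pset(wdw_id, Math.random() * 640, Math.random() * 400, "green");
        }
    }

    function about() {
        alert("IFS in ACEjs");
    }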

You can either download the contents of the public_html directory and open index.html in your browser or see the example application running here.

In early 2000 I wrote an article for Sky & Telescope (S&T) magazine’s astronomical computation column entitled Scripting: a programming alternative which proposed JavaScript as a suitable alternative to BASIC for astronomical computation, long used by S&T and others to disseminate programs. Even at that time, JavaScript was arguably the only programming language interpreter available on almost every personal computer, by virtue of the ubiquity of web browsers.

In essence, JavaScript had become the equivalent of the BASIC interpreter every old personal computer (formerly called microcomputers, especially in the 80s) once had. I made the example programs from the article available and experimented further; some examples show the original BASIC listing along with the JavaScript implementation.

A variant of the ideas that led to ACEjs is revealed in what I said on this page:

Peter Girard has suggested the creation of an ECMAScript code library for astronomical algorithms.

An idea I’ve had is to write a BASIC (which dialect: GWBASIC, QBasic, etc?) to ECMAScript translator, written in ECMAScript or Java. One could paste BASIC code into a text area on a web page, and have ECMAScript and HTML code generated on the fly. This would make the BASIC code on Sky & Telescope‘s web site available as interactive programs. Or, it could generate a listing, making Peter Girard’s idea of a code library easier to achieve.
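
To give a flavour of that translator idea, here is a toy sketch handling only LET and PRINT; a real tool would need a proper grammar for whichever BASIC dialect was chosen, and would have to map BASIC’s expressions and built-ins onto JavaScript rather than passing them through untouched as this does.

    // Toy BASIC-to-JavaScript translator for a tiny subset:
    // "LET x = expr" and "PRINT expr". Purely illustrative.
    function basicToJs(source) {
        return source
            .split("\n")
            .map(line => line.trim())
            .filter(line => line.length > 0)
            .map(line => {
                let m;
                if ((m = line.match(/^LET\s+(\w+)\s*=\s*(.+)$/i))) {
                    return `let ${m[1]} = ${m[2]};`;
                }
                if ((m = line.match(/^PRINT\s+(.+)$/i))) {
                    return `console.log(${m[1]});`;
                }
                throw new Error("Cannot translate: " + line);
            })
            .join("\n");
    }

    console.log(basicToJs("LET x = 42\nPRINT x * 3"));
    // => let x = 42;
    //    console.log(x * 3);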

Of course, there are now plenty of examples of BASIC interpreters written in JavaScript, e.g. here’s a QBasic implementation that generates bytecode and uses canvas. Then again, as I have noted, my aim was not to slavishly recreate the exact look & feel of the original platform.

S&T showed some initial interest in JavaScript again in 2005, regarding an orbit viewer page I wrote that combined JavaScript, a Java applet and cross-domain AJAX, back while Internet Explorer allowed it and before CORS was a thing.

Of course since then and for all kinds of reasons, JavaScript has come to dominate rich client browser applications, especially after the introduction of AJAX, and has generally become the assembly language of the web. More recently we’ve seen the rise of Node.js, an explosion of JavaScript web frameworks (Angular, React, …), and mobile JavaScript development frameworks like Apache Cordova. JavaScript has good points and bad along with detractors aplenty, but it’s hard to argue with its success.

History has shown that a programming language does not have to be perfect to succeed. I love C, but it’s far from perfect and holes in its type system allow one to, as the saying goes, “shoot one’s foot off”. Additionally, these same holes are responsible for security vulnerabilities in the operating systems we rely upon. Notice, I’m not saying that C itself is responsible (it’s not a person or a company) for exploits of those vulnerabilities; that’s attributable to the moral barrenness of the people involved. It’s unlikely that we’ll see the sum total of mainstream OS-land rewritten in safer languages (Rust, Haskell, …), to save us from ourselves, anytime soon.

But I digress…

I could repurpose ACE to generate JavaScript, but we are living in a time of “programming language plenty”. Creating a new language today should be considered a last resort. Domain Specific Languages, sure. Libraries and frameworks, perhaps. New languages? Looking at what’s available first before reinventing the wheel should be considered a responsibility. Also, a language is no longer enough by itself. You need an ecosystem of tools (IDE, debugger at least) and libraries for anyone to care enough to want to use your shiny new language beyond very simple programs. ACE had a couple of IDEs but no debugger. Heck, I didn’t even use a debugger when writing the compiler! Now I seem to live in source level debuggers. I’m obviously getting soft. 🙂

When I was a junior academic in the computing department at UTAS in the mid-90s, upon learning about my development of ACE, a senior and sometimes less-than-tactful colleague remarked that creating a new language was, as he so delicately put it, “a wank”. I disagreed. ACE was about providing the power of a compiled language for a particular platform (Amiga) to people who knew an interpreted language (AmigaBASIC), wanted to leverage that experience and existing code and didn’t feel confident enough to learn the dominant systems-level language of the time (C). It was also about improving the precursor language.

Now, I would agree that the decision to create a new programming language or library requires some circumspection, at the very least. But the programming language landscape has expanded a lot since the mid-90s. There is of course value in writing an interpreter or compiler, just for the learning as an end in itself and every computer science or software engineering student should do so.

So, in the end: why ACEjs?

In part because I wanted to explore simple ways to write or port a certain class of application (e.g. old ACE programs) to client-side web applications.

Partly out of a sense of nostalgia.

In part because I want to learn more JavaScript, Canvas, jQuery and jQuery-ui and the subtle differences between JavaScript versions.

Mostly, I wanted to get a bundle of ideas out of my system, which I’ve more or less done.

ACEjs is a simple starting point and if it’s useful to me, I’ll continue to improve it; if not, it will happily fade away. So far, I’ve tested it using Chrome 56 and Safari 9, with ECMAScript (the underlying JavaScript standard) versions 5 and 6.

Finally, look in the About box of the example application for a small dedication, also present in the even simpler About box of ifs.b; my wife spent far too long listening to me talk about programming languages and compilers in the 90s. Now the talk is more likely to be about variable stars. Thanks Karen. We both like ferns as well, IFS generated or natural. 🙂

In any case, enjoy. Feedback welcome.

Marvin Minsky (1927 to 2016)

January 31, 2016

MIT Artificial Intelligence (AI) pioneer Marvin Minsky died at the age of 88 on January 24th in Boston.


See this EE Times blog post for a good summary.

Marvin Minsky’s participation in the 1956 Dartmouth Conference along with John McCarthy, Nathaniel Rochester, and Claude Shannon gave rise to the term artificial intelligence. While considerable progress has been made in the domain of machine intelligence, Minsky’s book The Emotion Machine deals with some of the more recalcitrant aspects of human-level intelligence.

His work ranged widely, including early work in neural networks (perceptrons) or connectionism and symbolic or classical AI including expert systems. I took university classes in classical AI and enjoyed experimenting with neural networks in the 90s, but my knowledge representation Master’s thesis was very much in the symbolic camp.

Minsky provided advice for the movie 2001: A Space Odyssey regarding the delusional HAL 9000 computer. He made this remark about the film:

Kubrick’s vision seemed to be that humans are doomed, whereas Clarke’s is that humans are moving on to a better stage of evolution.

I’ll end with more Minsky quotes that provide some insight into this influential man’s thought processes. I particularly like the final tongue-in-cheek comment.

No computer has ever been designed that is ever aware of what it’s doing; but most of the time, we aren’t either.

If you just have a single problem to solve, then fine, go ahead and use a neural network. But if you want to do science and understand how to choose architectures, or how to go to a new problem, you have to understand what different architectures can and cannot do.

I believed in realism, as summarized by John McCarthy’s comment to the effect that if we worked really hard, we’d have an intelligent system in from four to four hundred years.

Peter Naur’s passing

January 21, 2016

Danish computer science pioneer, Peter Naur, died on January 3 2016 after a short illness, aged 87.

Peter Naur (source: http://www.naur.com)

Peter Naur received the ACM Turing award in 2005 for “…fundamental contributions to programming language design and the definition of Algol 60, to compiler design, and to the art and practice of computer programming”.

He is best known as the original editor of the Algol 60 Report, and as the “N” in BNF or Backus-Naur Form (with John Backus, of Fortran and other fame), first used to describe the syntax of Algol 60. He objected to this and thought BNF should denote Backus Normal Form instead. Nevertheless, MacLennan (1983), in Principles of Programming Languages: Evaluation and Implementation, notes the following about the connection between BNF and Naur:

Peter Naur, then the editor of the Algol Bulletin, was surprised because Backus’s definition of Algol-58 did not agree with his interpretation of the Algol-58 report. He took this as an indication that a more precise method of describing syntax was required and prepared some samples of a variant of the Backus notation. As a result, this notation was adopted for the Algol-60 report…

I presented examples of BNF from the Report along with Algol code fragments in a talk I gave to the Australian Computer Society about the 50th anniversary of Algol 60. Compiler construction tools like lex & yacc arose from the creation of BNF, and variations such as EBNF (with which I have spent more time) led to more expressive and concise programming language grammars and still more powerful tools.

Alan Perlis commented in 1978, with a pun on the begin and end keywords used to delimit code blocks, that:

Algol’s is the linguistic form most widely used in describing the new exotic algorithms…Where important new linguistic inventions have occurred, they have been imbedded usually within an Algol framework, e.g. records, classes, definitions of types and their operations,…, modules. Algol has become so ingrained in our thinking about programming that we almost automatically base investigations in abstract programming on an Algol representation to isolate, define, and explicate our ideas…It was a noble begin but never intended to be a satisfactory end.

Others have remarked upon the contribution of Algol:

Here is a language so far ahead of its time that it was not only an improvement on its predecessors but also on nearly all its successors. (1980 Turing Award Lecture, C.A.R. Hoare)

Lisp and Algol, are built around a kernel that  seems as natural as a branch of mathematics. (Metamagical Themas, Douglas Hofstadter)

Algol 60 lives on in the genes of Scheme and Pascal. (SICP, Abelson & Sussman)

Block structure, lexical scope, and recursion are just a few features that no Pascal, C/C++, Python, Java or C# programmer would find surprising. Naur and his collaborators played a large part in shaping the programming languages we think and code in today. Scheme is a dialect of Lisp (a language predating Algol) that was nonetheless influenced by Algol, e.g. in its lexical scoping and in the “language report” document approach (see the Revised Report on the Algorithmic Language Scheme).

As Perlis alludes to above, many of the beneficiaries of the descendants of Algol are unaware of how their language constrains the way in which they think about programs and programming, the Sapir-Whorf hypothesis in action.

The late Dutch computer scientist, Edsger Dijkstra remarked:

Several friends of mine, when asked to suggest a date of birth for Computing Science, came up with January 1960, precisely because it was Algol 60 that showed the first ways in which automatic computing could and should and did become a topic of academic concern.

Naur started his career as an astronomer, but changed his profession after encountering computers. He was not fond of the idea of programming as a branch of mathematics and saw it as very much a human activity, the sub-title of his 1992 book Computing: A Human Activity. Its Section 1.4, entitled Programming as Theory Building, challenges the simplistic Agile mantra that source code is all that matters; in fact, like a JPEG image file, source code can be seen as a lossy format, a distillation of a programmer’s thoughts with lost context, mitigated only in part by appropriate comments (an art form in itself).

Peter Naur (source: https://en.wikipedia.org/wiki/Peter_Naur)

In Programming as Theory Building, Naur outlines scenarios relating to writing a compiler for similar languages and the development of a large real-time industrial monitoring system. In his words:

The conclusion seems inescapable that at least with certain kinds of large programs, the continued adaption, modification, and correction of errors in them, is essentially dependent on a certain kind of knowledge possessed by a group of programmers who are closely and continuously connected with them.

Naur was offered a professorship in computer science at the University of Copenhagen in 1969. He thought that computer science was fundamentally about the nature and use of data, didn’t much like the phrase, and coined the term datalogy instead, which he thought had a more human orientation, giving rise to what has been called the Copenhagen interpretation of computer science (as opposed to quantum physics).

Later in his career, Peter Naur was critical of contemporary conceptions of science and philosophy, developed a theory of human thinking called Synapse-State Theory of Mental Life, contrasted human vs computer thinking (see this video) and rejected the possibility of strong AI. I may not agree with all of Naur’s ideas in this area, but consider them worth hearing.

As an aside, this Y-Combinator post about Naur’s passing makes the interesting observation that a number of widely used programming languages other than Algol 60 have Danish origins, e.g.

  • PHP (Rasmus Lerdorf)
  • Turbo Pascal, Delphi, C# (Anders Hejlsberg)
  • C++ (Bjarne Stroustrup)

It has occurred to me increasingly in the last few years that many of our field’s pioneers are reaching the end of their lives. Even confining oneself to those associated with programming language design and implementation, since 2010, at least the following have died:

  • Dennis Ritchie: C
  • John McCarthy: Lisp
  • Robin Milner: ML
  • Peter Naur: Algol

Before I started writing this post, I knew about Naur’s association with Algol 60 and BNF, but that’s about all. Now I’ve discovered that, like so many pioneers, he had depths I have only just started to explore, even if confining myself to his work Computing: A Human Activity.

 

Back to Future with (g)vi(m)

October 24, 2015

In the 90s I was pretty happy if you sat me down with a C compiler and vi. Any programmer, at least one exposed to a Unix variant of some kind, will know about the sometimes quirky but very powerful editor, vi. At UTAS, as a computer systems officer (“jack of all trades, master of none”: coding, soldering, networking, software/hardware installation), I taught vi and other Unix-related topics to academics in short courses; later, as a junior academic, I continued to use it whenever I was on a Unix system. It may sound a bit sad, but one of my happiest two-week stretches of employment was spent in a portable office (terrapin style) with a DEC VT220 terminal, SunOS (or it may have been Solaris by then) Unix, a C compiler and vi, with which I was tasked to write the core of a student enrolment system for UTAS, one that remained in use for several years after I left.

I also occasionally used emacs and, on pre-OS X Macs, whatever editor the IDE (integrated development environment, although I suspect the term wasn’t in use at the time) provided, for example THINK C.

On the Amiga I used an editor that came with AmigaDOS (ED or EDIT was, I think, the name), MicroEMACS, and at least one port of vi. These, especially the fairly minimalist MicroEMACS, were perfectly fine for developing my ACE Amiga BASIC compiler.

I’ve also had a go at writing a simple editor or two, including one for a PICK mainframe email system I developed as an undergraduate project.

Upon finding the need to leave UTAS (due to university funding cuts around 1996), I was offered work with a Tasmanian Internet Service Provider (ISP) as a programmer and sysadmin, when things were just getting started in that area. Being paid to code in C and learning Perl kept me happy for a while; although I knew from the start that I’d eventually want something more than ISP work, it was a great opportunity. Best of all, I was able to provide employment there for one of my former UTAS students after I left, giving him a start in the game. There, as always, was vi, this time on FreeBSD and BSDi Unix systems.

When I started working for Motorola (and later Freescale Semiconductor) in 1997, I found a fairly even split between use of vi and emacs there (under Solaris), along with a high degree of religious adherence to one or the other, the kind of zeal that still accompanies the adherents of particular programming languages. I’ve always had a fascination for the LISP programming language, so emacs with its in-built LISP interpreter won points on that front, along with specific modes of use in the Motorola/Freescale environment.

These days I’m a bit of a generalist when it comes to both editors and programming languages (C, C++, Java, Python, R etc), although I have my favourites and those less favoured. That’s the subject of another post, I think.

On any given day, I could find myself using vi, emacs, Eclipse, Visual Studio, PyCharm and various other IDEs. On Unix (okay, Linux systems now) or cygwin I have for many years tended to use vi (okay, vim, its now dominant incarnation) for quick edits when I want an editor immediately, and emacs for more complex editing. Despite the power of modern IDEs, they are, like modern operating systems, often slow resource hogs, and tend to leave me a bit cold. At least some of them have emacs and vi modes for their editors. There are other newcomer editors under Windows and Unix that, while fine, don’t compel me to use them. To be fair to IDEs though, if you need non-trivial source-level debugging, they’re hard to beat. Having said that, I’ll still break out a command-line debugger when it’s appropriate.

Now, as a software engineer with CSIRO, coding in C++ in Linux high performance compute (HPC) cluster environments, I find myself using vi, or in particular gvim (“graphical vi improved”), more and more again. Once more I’m really loving its power and simplicity, including tags for source code navigation, split windows, and all the keyboard shortcut goodness that has always made vi fast and productive to work with. It’s also better from a resource usage point of view on an HPC system.

Maybe this is partly motivated by a nostalgic streak, but mostly arises from a pragmatic approach.

Anyway, sometimes improving on the past is not so easy or at best only incremental.

BASIC’s 50th, early micros, and ACE BASIC for the Amiga

May 4, 2014

I enjoyed reminiscing about BASIC when it recently turned 50, on May 1 2014. I learned more about the events surrounding the creation of Dartmouth BASIC from the Dartmouth web pages and especially from interview videos with co-inventors John Kemeny and Thomas Kurtz. Given my development of ACE BASIC for the Amiga in the mid-90s, the BASIC programming language has a special place in my heart. More about ACE shortly. My first experience with BASIC and programming in general was in 1977, in my second year of high school (Norwood High). Our class marked up a deck of cards (in pencil) with a BASIC program and submitted them to Angle Park Computing Centre. A week or two later I remember receiving a printout of a partial run, showing an ASCII plot of some function (a deceleration curve I think) tantalisingly cut short by an error, the details of which I don’t recall.

At the time I thought that was an interesting experience but set it aside. As I described here, in 1978, the school bought a PDP-11 and installed it in an air-conditioned room complete with a card reader, printer, and terminals. I remember seeing the machine for the first time, gawking in wonder through the glass window in the door to the room. For the first 6 months most students were only allowed to create card decks rather than using a terminal. At least the turnaround time was less than for Angle Park: you could get your program run’s print-out, correct errors in your card deck and submit it again via the card reader’s hopper.

Apart from a small amount of class-time exposure to the machine, I became a “computer monitor”, assigned on a roster to be there while others used the computer, given a modicum of responsibility for looking after the machine (e.g. card reader or printer problems, booting) but I didn’t learn too much more about the PDP-11 that way.

What really hooked me was eventually being allowed to use the terminals (pictured at right) and the interactive BASIC programming that entailed. There was plenty of competition for terminal time! One of the first interactive programs I wrote was a simple guess-the-number game in which the user was told whether a guess was less or greater than the number the machine was “thinking” of. It seems trivial now but that experience of interacting with an “artificial intelligence” (as it seemed to me at the time) was intoxicating and this has stayed with me. Some fellow students started playing around with machine language on the PDP-11; that was a little beyond me at the time but an understanding of that level would become important for me later.

In the late ’70s, Tandy had a computer store in Gawler Place, Adelaide. I used to catch a bus into town on Friday nights, pull up a chair at a TRS-80 Model 1 on display and sit there for an hour or two typing in BASIC source code for games from a book; the sales people didn’t seem to mind too much. 🙂

When I’d finished year 12 of high school, had started working as a nurse in 1981, and was earning money, I bought a CASIO FX-702P, essentially a calculator with an interface for a cassette recorder (for programs and data) and a printer, and programmable in BASIC. Within a year or so, I had a Sinclair ZX-81 connected to my parents’ old HMV black and white TV in the country (where I visited most weekends): a big screen indeed! This odd little machine fired my imagination via its space-age programming manual cover. Adding the 16K RAM expansion pack (shown below at rear) allowed much larger programs to be written compared to the unexpanded 1K machine. Programming in BASIC while listening to music like Kraftwerk’s Computer World, with simplistic, German-accented lyrics like these:

I program my home computer. Beam myself into the future.

it somehow seemed that the future was coming fast and that it was going to be overwhelmingly positive. This was a time of innocent joy when nothing was standardised (hardware or operating systems), the term micro-computer was more likely to be used than personal computer, the sterile DOS-based IBM PC “business computer” was barely beginning to emerge and the Macintosh was not yet in sight.

The pages of magazines like Australian Personal Computer and Compute! were filled with BASIC program listings for specific machines just crying out to be adapted to other BASIC dialects. Reading books such as Christopher Evans’ The Mighty Micro (1979) filled me with optimism for the future. Reading Isaac Asimov’s I, Robot and the like fired my imagination, as did TV shows like Dr Who and Blake’s 7. To be honest, all of this was also somewhat of a welcome escape from the daily realities of being a young nurse.

My next machine was a Commodore PET (CBM 4016), built like a Sherman tank. I added a 5.25″ floppy disk drive (which had cooling problems!) and a dot matrix printer via the PET’s IEEE interface. I happily spent many weekends creating games in BASIC on this computer. I also wrote a version of Eliza-the-psychotherapist that kindled an interest in artificial intelligence and language processing. Occasionally entering the PET’s machine language monitor got me thinking more about low-level concepts (processor registers etc). Reading a book called Programming the 6502 by Rodnay Zaks (1981) helped further my understanding. That PET was followed by a VIC-20 and C-64 into the mid-80s, both of which I (of course) programmed in BASIC and a bit of hand-assembled 6502/6510 machine code POKEd into obscure areas of memory (such as the cassette buffer, not in use when a 5.25″ floppy disk drive was the secondary storage device). I started to gain some exposure to databases (SuperBase 64), word processors and other programming languages like Pascal. Interfacing with relay boards and sensors was also something I enjoyed using BASIC for with these machines, achieved by POKEing values into and PEEKing values from I/O memory locations.

In 1987, after a couple of years going off in various directions, I moved from Adelaide to Tasmania to work as a nurse (in ICU, Recovery etc) where I met my future wife, Karen. I didn’t have any computer with me there because I initially thought I’d only stay for a year or so, but ended up staying for a decade. My first computer purchase in Tasmania was an Acorn Electron, little brother to the BBC Micro and programmable in a BBC dialect of BASIC. I also learned a bit of LISP (from a cassette-loaded interpreter) using the Electron.

By far the most important computer purchase ever for me was a Commodore Amiga 500. I learned so much from that machine, initially programming it in AmigaBASIC and smatterings of machine code, then in C and a handful of other languages. The Amiga’s pre-emptive multi-tasking operating system and state of the art graphics and sound capabilities were fantastic. It was also to this machine that I connected my first hard disk drive. I wrote simple astronomy programs, a simple drawing program for Karen, and created an alarm and security system with infra-red sensors, keypad, strobe light etc. It even woke me up (or more likely Karen, so she could come to my aid) if I went sleepwalking. 🙂 I also used the Amiga and C64 for the pre-Internet Viatel (Australia’s videotex system), bulletin boards, and CompuServe.

I took a statistics course at UniTas (Launceston) for fun in 1987 and a year or so later had started an Applied Computing degree there. I took a double major in computer science and philosophy. This ultimately led me away from a career in nursing and onto a software engineering career (after stints as a computer systems officer and a junior academic post-graduation). One of the subjects I took as an undergraduate was “Advanced Programming”, in which we wrote a compiler for a subset of Pascal into p-codes (similar to UCSD p-codes and not unlike Java VM bytecodes) rather than assembly or machine code for the native machine (Intel). One outcome is that I became increasingly interested in programming language translation and programming paradigms (especially object oriented, functional, logic and concurrent). Another outcome is that I resolved to take that knowledge and write a compiler for the Amiga for a language that I myself would want to use, not just as an academic exercise.

In October 1991, I started development of ACE BASIC for the Commodore Amiga computer. It was released to testers in March 1992 and made available for public consumption in February 1993. Like the original Dartmouth BASIC, ACE was compiled, unlike many implementations that have been interpreters. ACE was a compiler for the interpreted Microsoft AmigaBASIC that shipped with the Amiga.

This article written for the online Amiga Addicts journal gives some idea of the history and motivations for ACE and here is an interview I gave in 1997 about ACE. Although the instruction set of the Amiga’s 68000 processor was not quite as orthogonal as the PDP-11’s, it was still really nice. ACE compiled BASIC source into peephole optimised 68000 assembly code.


This was assembled to machine code by Charlie Gibbs’ A68K assembler and linked against library code with the Software Distillery’s Blink linker (later I also used PhxAsm and PhxLnk). I wrote 75% of ACE’s runtime libraries in 68000 assembly, later waking up to the idea that C would have been a more productive choice. One upside is that I became quite comfortable working with assembly language. I’ve made use of that comfort in recent years when working with hardware simulator testing (ARM, PowerPC etc) and micro-controller compilers.

A wonderful community of enthusiastic users built up around ACE. I wrote an integrated development environment, and a community member wrote one too (Herbert Breuer’s ACE IDE is shown below).

Another member wrote a “super-optimiser” that rewrote parts of ACE’s generated assembly code to be even faster than I managed with my simple optimisations.

Herbert Breuer’s ACE IDE

ACE was guilty of a criticism by the Dartmouth BASIC co-inventors (John Kemeny and Tom Kurtz), levelled at many BASICs since their first: that of being machine-specific. But then that was the intent for ACE: to make programming the Amiga more approachable to more people, combining the simple abstractions of BASIC with the unique features of the Amiga and the run-time efficiency of a compiled language like C.

Given the Amiga’s demise, around 1996 I moved onto other platforms. I wrote a LISP interpreter for the Newton PDA (also doomed; I can pick ’em!) between 1998 and 2000. That was fun and had a nice small community associated with it, but it didn’t match ACE and its community.

I eventually came to possess PCs, programming them with a smattering of GW-BASIC, quite a lot of Turbo Pascal, some Microsoft Quick C, a little assembly, and Visual BASIC.

When Java appeared in 1996 I greeted it with enthusiasm and have been an advocate of it and the Java Virtual Machine, as a professional and spare-time software developer, on and off ever since. These days I’m more likely to code in Java, C/C++, Python (where once I would have used Perl) or perhaps R rather than a BASIC dialect, none of which denigrates BASIC.

The fact is that BASIC made early microcomputers accessible such that many of us interacted with them in ways more directly than is possible with modern computers (PCs and Macs), despite all their advantages and power. Arguably, we expected less from the machines yet engaged in highly creative relationships with them. Anyone who has spent much time programming will recognise the allure. The interactive nature of these early BASIC machines only added to this.

I agree with the famous Dutch computer scientist Edsger Dijkstra when he says that:

Computing Science is no more about computers than astronomy is about telescopes.


I also sympathise with his declaration that the BASIC statement GOTO could be considered harmful, due to the “spaghetti code” it leads to. But I don’t really agree with his assessment that:

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

I heard the same propaganda from a University lecturer. Apparently some of us were able to be “rehabilitated”. Then again, along with his comments about BASIC, Dijkstra made some unkind comments about other programming languages, including COBOL, Fortran, and APL, for example:

The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.

With apologies to Grace Murray Hopper, I have more sympathy with this last one. 🙂


The truth is that all programming languages are crude approximations of the Platonic ideal of bringing together two minds: one artificial, one natural. There are no programming languages about which I can say: there are no improvements possible here, and there are very few languages that make communion with the machine a beautiful experience. All are to be reviled and admired for different qualities. But BASIC, in all its dialects, with all its flaws, and with the benefit of 20/20 hindsight and large gobs of nostalgia, was mostly to be admired and was directly responsible for many of us falling in love with the idea and activity of programming.


Viva la BASIC! Happy 50th anniversary!

An Arduino Olympic Torch model

February 5, 2014

The 2012 London Olympic Games coincided with my son’s last year of primary school. Each year the school has an “expo” in which students explore and create displays on some theme. In that year the school decided the theme would be the Olympics.

One idea we (my son, wife and I) came up with was a model of an olympic torch with a “flickering flame” effect. This was just one of a number of items that had to be prepared by my son for the “expo”, and some parental assistance was expected, so we helped with that.

My wife created the body of the torch from a cone-shaped industrial cotton reel and a cardboard coffee cup, all spray-painted gold.

I considered using a PIC micro-controller and a couple of discrete tri-colour high-intensity LEDs, but given the time and size constraints I decided to use a Freetronics LeoStick Arduino with two high-intensity LED units on a ProtoStick (from Jaycar in Adelaide) powered by a 9V battery. The LeoStick sits inside the coffee cup with cellophane over the top to provide a diffuser (added by my wife), as shown below.

Arduino Olympic Torch: top view

A momentary push-button switch is used to start the “flickering flame” for a short time as shown in this YouTube video.

The serially clocked tri-colour high intensity LED units randomly and rapidly cycle through various colours to produce the effect.

This Freetronics page describes the LED components and their usage in more detail.

The following image shows the individual disassembled components:

Arduino Olympic Torch: disassembled components

Note the large (by comparison to the LeoStick) home-brew 5V regulated power supply to which a 9V battery is connected. The power supply and battery sit inside the conical handle and are attached to a toggle switch (power on/off) in the base, a water bottle lid in this case.

The following image shows the components connected:

Arduino Olympic Torch: complete circuit

The high-intensity LED units are attached to the ProtoStick via headers soldered to the latter.

The following schematic shows the simple Arduino-based circuit:

Arduino Olympic Torch: schematic

The resistor in the schematic is a surface mount component soldered to the ProtoStick.

The code for the project is shown below. As can be seen from the excluded code at the end of the loop() function, I toyed with adding a wake-from-sleep interrupt to conserve power while the circuit was idle, but didn’t get that to function within the time available. There was no loss of functionality of course, just a greater drain on the battery because of the unfortunate busy-looping.

/*
  * 2012 EMPS Expo Olympic Torch RGB LEDs, for Nicholas.
  *
  * Arduino (LeoStick)
  *
  * References:
  * - http://www.freetronics.com/pages/leostick-getting-started-guide#.UBOpjJkthmo
  * - http://www.freetronics.com/pages/rgbled-rgb-led-quickstart-guide#.UAq-qZktg38
  * - http://www.arduino.cc/en/Tutorial/Debounce
  *
  * dbenn@computer.org, August 2012
  */

 #include <avr/sleep.h>

 // RGB LED strip pins (clock and serial data).
 const int CKI = 10;
 const int SDI = 12;

 // Button pin
 const int BUTTON = 2;

 // Button state variables.
 int buttonState = LOW;
 int lastButtonState = LOW; 

 // The following variables are longs because the time, measured in milliseconds,
 // will quickly become a bigger number than can be stored in an int.
 long lastDebounceTime = 0;  // the last time the output pin was toggled
 const long DEBOUNCE_DELAY = 50;  // the debounce time; increase if output is unstable

 // Function prototypes.
 void wakeupHandler();
 void blankLEDStrip();
 void randomiseLEDStrip(long millisecs);
 void setLEDStrip(long color1, long color2);
 void pushLED(long color);
 boolean isButtonToggled();
 boolean isButtonPressed();
 long getRndColor();

 void setup() {
   pinMode(BUTTON, INPUT);
   pinMode(SDI, OUTPUT);
   pinMode(CKI, OUTPUT);
   randomSeed(analogRead(1));
   blankLEDStrip();
 }

 void loop() {
   // When the button is toggled, generate
   // random colors with random pauses between
   // each for 15 seconds, then blank the LED strip.
   if (isButtonToggled()) {
     randomiseLEDStrip(15000);
     blankLEDStrip();
   }
 #if 0
   // A sleep mode with wake-from-interrupt would decrease power consumption.
   set_sleep_mode(SLEEP_MODE_PWR_DOWN);
   sleep_enable();
   interrupts();
   attachInterrupt(0, wakeupHandler, RISING);
   sleep_mode();
 #endif
 }
 #if 0
 void wakeupHandler() {
   // When the button is toggled, generate
   // random colors with random pauses between
   // each for 15 seconds, then blank the LED strip.
 //  while (!isButtonToggled());
   randomiseLEDStrip(15000);
   blankLEDStrip();
 }
 #endif
 // Blank the RGB LED strip.
 void blankLEDStrip() {
     setLEDStrip(0x00000000, 0x00000000);
 }

 // Generate random colors for random periods for the specified number of milliseconds.
 void randomiseLEDStrip(long millisecs) {
   long t1 = millis();

   while (millis() <= t1+millisecs) {
     setLEDStrip(getRndColor(), getRndColor());
     delay(random(100));    
   }
 }

 // Set the colors on the RGB LED strip.
 // Colours are 24-bit values, so long is used (an AVR int is only 16 bits wide).
 void setLEDStrip(long color1, long color2)
 {
   pushLED(color1);
   pushLED(color2);

   // Pull clock low to put LED strip into reset/post mode
   // and wait for 500 microseconds for completion.
   digitalWrite(CKI, LOW);
   delayMicroseconds(500);
 }

 // Push 24-bit RGB colour data (RRGGBB; red is MSB) out to the LED strip.
 // The colour is passed as a long because a 16-bit AVR int would truncate it.
 void pushLED(long color) {
   for (int color_bit = 23 ; color_bit >= 0 ; color_bit--) {

     // Low clock signals bit change start.
     digitalWrite(CKI, LOW);

      // Force mask to be 32-bit instead of 16-bit by default.
     long mask = 1L << color_bit;

     // Write data bit and latch.
     digitalWrite(SDI, color & mask ? HIGH : LOW);
     digitalWrite(CKI, HIGH);
   }
 }

 // Returns whether the button was toggled (pressed then released).
 boolean isButtonToggled() {
   boolean toggled = false;

   if (isButtonPressed()) {
     while (isButtonPressed());
     toggled = true;
   }

   return toggled;
 }

 // Returns whether the button was pressed since the last time the function was called.
 boolean isButtonPressed() {
   // Read the state of the switch into a local variable.
   int currentButtonState = digitalRead(BUTTON);

   // Check to see if we just pressed the button 
   // (i.e. the input went from LOW to HIGH), and we've waited 
   // long enough since the last press to ignore any noise.  

   // If the switch changed, due to noise or pressing...
   if (currentButtonState != lastButtonState) {
     // ...reset the debouncing timer.
     lastDebounceTime = millis();
   } 

   if ((millis() - lastDebounceTime) > DEBOUNCE_DELAY) {
     // Whatever the current button state is, it's been there for longer
     // than the debounce delay, so take it to be the actual current state.
     buttonState = currentButtonState;
   }

   // Save the current button state. Next time through the loop,
   // it'll be the lastButtonState.
   lastButtonState = currentButtonState;

   return buttonState == HIGH;
 }

 // Return a random color.
 // Proportions of each color component can be tweaked in the selection comparisons.
 long getRndColor() {
   long primary = random(100);
   long rgb = 0xFFFFFF;

   if (primary < 50) {
       rgb = 0xFF0000 | random(0x100) << 8 | random(0x100); 
   } else if (primary < 75) {
       rgb = 0x00FF00 | random(0x100) << 16 | random(0x100); 
   } else {
       rgb = 0x0000FF | random(0x100) << 16 | random(0x100) << 8; 
   }

   return rgb;
 }