Archive for the ‘Uncategorized’ Category

Nova near Southern Cross

January 15, 2018

Rob Kaufman in Victoria discovered a possible nova (PNV J11261220-6531086) near the Southern Cross (Crux), in the constellation of Musca, on January 14, 2018. All novae start out with the designation PNV (possible nova).

Rob’s discovery visual estimate was magnitude 7. I estimated it tonight with 7×50 binoculars at magnitude 6.7, relative to comparison stars of magnitude 6.4 and 7.1.

This context screenshot from Stellarium shows the nova’s location (cross-hairs at upper middle of image) relative to familiar stellar signposts, including Crux and Alpha Muscae, at 10pm Adelaide time (ACDT).
PNV J11261220-6531086 wide

The next shows a narrower field of view, with the nova to the right of the helpful triangular, A-shaped asterism.

PNV J11261220-6531086 narrow

Here’s a 10° finder chart from AAVSO

X22594EO

and an 8° finder chart with the orientation closer to that of the sky around tonight’s observation time. The two comparison stars I used are circled in red.

X22594EI

After submitting my observation to AAVSO tonight, I noticed that since Rob’s discovery observation, only two others had been submitted:

  • another visual estimate by Sebastian Otero in Argentina (6.85);
  • and a tri-colour green DSLR observation (6.72) by David Blane in South Africa.

What I love about such transients is their spectacular rise in brightness and their unpredictability.

Initial spectroscopy by Rob indicates a classical nova. I’d expect to see more amateur spectroscopy of this object in the near future.

Will it become visible to the naked eye like the similarly southern and close-to-Crux V1369 Cen did in 2013 (peaking at around magnitude 3.4)? One never knows with these things, but it’s worth noting that, as per the CBAT transient report page, ASAS-SN observations suggest the nova may actually have started in the first few days of January. If so, perhaps we’re a little too far down the track to expect naked eye visibility. All we can do is observe and see!

Being such a southerly object, it will not be as well observed as novae in the northern hemisphere, but it’s in a great location, so have a go if you can! I’ll be out observing it on every clear night I can in the days to come, visually and possibly via DSLR.

ACE, optimisation and an old friend

December 29, 2017

In late November I received an email from an old friend: Sean Miller. Sean was a member of the wonderful community that built up around ACE BASIC, a compiler for the Amiga I developed as a labour of love between 1991 and 1996. I’ve written about ACE in this blog before. Sean told me how his use of ACE influenced him over the years. It has been great to get back in contact with him.

I felt honoured and humbled when, on Christmas Eve, Sean released an episode about ACE on the Raising Awesome YouTube channel he and his son have created. In this episode (Retro Amiga Computing – ACE BASIC and Questions2.1 Development):

Sean shows how to use ACE on an Amiga emulator to compile and run a program he wrote more than 20 years ago (Questions).

Retro Computing with ACE

I’ve expressed this in email to Sean, but let me say it publicly: thank you Sean! It means more to me than I can say.

During the video, Sean comments on the progress of the compilation of Questions, notes that there were around 4000 peephole optimisations (see screenshot from video above) and wonders whether I might explain what a peephole optimisation is. I’d be happy to, of course. Now you’ve got me started! 🙂

ACE generates assembly code for the beautiful Motorola 68000 microprocessor. Compilation of some ACE language constructs generates sub-optimal assembly code instructions on the first pass. Assembly code is emitted as ACE parses the input source code without any knowledge of the broader context of the program.

Here’s a trivial ACE program:

x%=42
y%=x%*3

This simply stores 42 in the short integer variable x% (the type is denoted by %), multiplies x% by 3 and stores the product in the variable y%. I chose integer over floating point for this example because the assembly generated for floating point is more complex and would distract from the explanation. Speaking of distractions…

As an aside, unlike modern Intel, ARM and other processors, the 68000 didn’t have a floating point unit (FPU), so floating point operations were carried out by library code, such as a Motorola Fast Floating Point or IEEE 754 library, instead of hardware. As an aside to my aside, the Amiga 500 had a 68000 processor whereas the Amiga 1200 (I owned both eventually) had a 68020. The 68020 could offload floating point instructions (which it did not know how to handle) to a co-processor, the 68881 or 68882. The 68040 was the first 68k processor with an on-board FPU. This is a whole topic by itself.

Back to the trivial example ACE program above…

Here’s the 68000 assembly ACE generates for the two-line program without any optimisation (i.e. without the -O option), with a comment on each instruction:

    move.w  #42,-(sp)      ; push 42 onto the stack
    move.w  (sp)+,-2(a4)   ; pop it into x%
    move.w  -2(a4),-(sp)   ; push x%
    move.w  #3,-(sp)       ; push 3
    move.w  (sp)+,d0       ; pop 3 into register d0
    move.w  (sp)+,d1       ; pop x% into register d1
    muls    d1,d0          ; d0 = d1 * d0
    move.l  d0,-(sp)       ; push the product (as a long)
    move.l  (sp)+,d0       ; pop it straight back into d0
    move.w  d0,-(sp)       ; push its low word
    move.w  (sp)+,-4(a4)   ; pop that into y%

With optimisation we have 6 assembly instructions instead of 11:

    move.w  #42,-2(a4)     ; x% = 42
    move.w  -2(a4),-(sp)   ; push x%
    move.w  #3,d0          ; d0 = 3
    move.w  (sp)+,d1       ; pop x% into d1
    muls    d1,d0          ; d0 = d1 * d0
    move.w  d0,-4(a4)      ; y% = the product

Looking at the first two lines of the 11-instruction unoptimised sequence:

    move.w  #42,-(sp)
    move.w  (sp)+,-2(a4)
lifo_stack1
Example stack operations (source: goo.gl/5EuhjG)

ACE examines this pair through a sliding window, or so-called peephole, over the emitted instructions and notices that 42 is being pushed onto the last-in, first-out (LIFO) stack, then immediately popped from the stack and stored into the variable x%’s address, represented by an offset of -2 from the address stored in register a4. The peephole optimiser reduces this push-pop pair to a single instruction:

    move.w  #42,-2(a4)

ACE stops short of taking the newly optimised pair:

    move.w  #42,-2(a4)
    move.w  -2(a4),-(sp)

then peephole optimising it and emitting this:

    move.w  #42,-(sp)

The reason is that the programmer has asked for 42 to be stored in the variable x%, and x% may be read again later in the program; without more global analysis, ACE can’t know that the store is safe to eliminate.

A more ideal sequence would have been this:

    move.w  #42,-2(a4)
    move.w  -2(a4),d0
    muls    #3,d0
    move.w  d0,-4(a4)

which literally says:

  • move 42 into variable x%’s memory location
  • move the value stored at x%’s memory location into the 68k register d0
  • carry out a signed multiplication of 3 with the contents of register d0, storing the result in d0
  • move the contents of register d0 into variable y%’s memory location

If the constraints relating to use of x% and y% did not exist, the following would be sufficient to yield the product of 42 and 3 in 68k assembly:

    move.w  #42,d0
    muls    #3,d0

Notice that the 4 instructions after the multiplication (muls) in the unoptimised sequence are optimised, over more than one pass of the peephole optimiser, down to a single instruction that stores the product into y%, from this:

    move.l  d0,-(sp)
    move.l  (sp)+,d0
    move.w  d0,-(sp)
    move.w  (sp)+,-4(a4)

to this:

    move.w  d0,-4(a4)

So ACE does better with this sequence than with the one before the multiplication.
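
For the curious, this push/pop merging is easy to sketch in C. The following is illustrative only (the names and structure are mine and hypothetical, not ACE’s actual source): slide a two-instruction window over the emitted code and, wherever a push is immediately followed by a pop, rewrite the pair as a single move, sweeping repeatedly until nothing changes.

    /* Peephole sketch: merge "move.w X,-(sp)" + "move.w (sp)+,Y"
       into "move.w X,Y", sweeping repeatedly until no rewrite fires. */
    #include <stdio.h>
    #include <string.h>

    /* Rewrites push/pop pairs in place; returns the new instruction count. */
    static int peephole(char code[][32], int n) {
        int changed = 1;
        while (changed) {            /* one rewrite may expose another */
            changed = 0;
            for (int i = 0; i + 1 < n; i++) {
                char src[32], dst[32];
                int plen = 0, qlen = 0;
                (void)sscanf(code[i],     "move.w %31[^,],-(sp)%n", src, &plen);
                (void)sscanf(code[i + 1], "move.w (sp)+,%31s%n",    dst, &qlen);
                if (plen == 0 || code[i][plen] != '\0' ||
                    qlen == 0 || code[i + 1][qlen] != '\0')
                    continue;        /* not a push immediately followed by a pop */
                /* Merge the pair into a single move... */
                snprintf(code[i], sizeof code[i], "move.w %s,%s", src, dst);
                /* ...and close the gap left by the deleted instruction. */
                memmove(&code[i + 1], &code[i + 2],
                        (size_t)(n - i - 2) * sizeof code[0]);
                n--;
                changed = 1;
            }
        }
        return n;
    }

    int main(void) {
        char code[8][32] = { "move.w #42,-(sp)", "move.w (sp)+,-2(a4)" };
        int n = peephole(code, 2);
        for (int i = 0; i < n; i++)
            printf("    %s\n", code[i]);    /* prints: move.w #42,-2(a4) */
        return 0;
    }

The outer repeat-until-stable loop is what makes multiple passes possible: merging one pair can bring a new pair together. In the four-instruction sequence above, for instance, merging the first push/pop pair would produce the redundant move.l d0,d0, which is the sort of thing another simple rule of the same kind can then delete.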

There are other simple optimisations carried out when the -O option is used, relating to numeric negation, but this example illustrates the key aspects.

Bernd Brandes later wrote a more powerful optimiser for ACE, the SuperOptimiser, that built upon this simple peephole optimisation approach.

Every instruction the processor doesn’t have to execute means fewer CPU cycles, and so a run-time speed-up. This matters a lot when, for example, such instructions are part of a loop that iterates many times.

To revisit ACE’s code generation and optimisation implementation, I downloaded Vidar Hokstad’s improvements to the ACE source (on GitHub), made for compilation under Linux. I compiled that on my Mac OS X laptop and used it to generate 68k assembly code. Vidar contacted me several years ago to say that he was engaging in “software archaeology” (that made me feel a bit old, even then) with the ACE source code. I appreciate Vidar’s efforts. He sorted out some compilation problems under the GNU C compiler (gcc) that I would otherwise have had to tackle myself.

It’s interesting to look at the Intel assembly generated by gcc for a similar C code fragment. The following would have to be embedded in a function:

int x,y;
x=42;
y=x*3;

The gcc compiler generates this sequence (commented here; x appears to live at -8(%rbp) and y at -12(%rbp)):

    movl    $0, -4(%rbp)          # an unrelated stack slot (likely the function's return value)
    movl    $42, -8(%rbp)         # x = 42
    imull   $3, -8(%rbp), %ecx    # ecx = x * 3
    movl    %ecx, -12(%rbp)       # y = ecx

As with the ACE-generated 68k assembly, only the relevant part is shown. There’s additional code generated just to start up and shut down a program (by gcc, ACE or any other compiler). The Intel assembly generated here is a bit better than the optimised 68k code ACE generated (4 vs 6 instructions) although, surprisingly, not by much.

When I wrote ACE in the 90s, all components were written either in C or 68000 assembly, and I went straight from an implicit parse tree to assembly code generation. These days I tend to use ANTLR or similar tools for lexical analysis (converting character streams to tokens) and parsing (checking token sequences against the language’s grammar). I have yet to use The LLVM Compiler Infrastructure for language development, but that’s on my list too.

Creating an intermediate representation (such as abstract syntax trees) before code generation provides additional opportunities for optimisation, something I’ve been exploring in recent times. I’ll write more about that in another post.
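
To give the flavour of what a tree representation buys you, here’s a small illustrative sketch in C (hypothetical, not from ACE): once an expression such as 42 * 3 lives in a tree, rather than being turned into instructions during parsing, a constant sub-expression can be folded away before any code is generated at all.

    /* Constant folding over a toy expression AST:
       MUL(NUM 42, NUM 3) collapses to NUM 126 before code generation. */
    #include <stdio.h>
    #include <stdlib.h>

    typedef enum { NUM, MUL } Kind;

    typedef struct Node {
        Kind kind;
        int value;               /* valid when kind == NUM */
        struct Node *lhs, *rhs;  /* valid when kind == MUL */
    } Node;

    static Node *num(int v) {
        Node *n = malloc(sizeof *n);
        n->kind = NUM; n->value = v; n->lhs = n->rhs = NULL;
        return n;
    }

    static Node *mul(Node *l, Node *r) {
        Node *n = malloc(sizeof *n);
        n->kind = MUL; n->lhs = l; n->rhs = r;
        return n;
    }

    /* Bottom-up: fold MUL(NUM a, NUM b) into NUM(a * b). */
    static Node *fold(Node *n) {
        if (n->kind == MUL) {
            n->lhs = fold(n->lhs);
            n->rhs = fold(n->rhs);
            if (n->lhs->kind == NUM && n->rhs->kind == NUM) {
                int v = n->lhs->value * n->rhs->value;
                free(n->lhs); free(n->rhs);
                n->kind = NUM; n->value = v; n->lhs = n->rhs = NULL;
            }
        }
        return n;
    }

    int main(void) {
        Node *expr = fold(mul(num(42), num(3)));
        printf("folded to %d\n", expr->value);  /* 126: no muls needed at run time */
        return 0;
    }

A parse-and-emit compiler like ACE never has the whole expression in hand at once, so the best it can do is clean up the instruction stream after the fact, which is exactly what the peephole optimiser is.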

To be honest, the more I think and write about this topic again, the more I want to.

Thanks again Sean.

Roy & I

March 11, 2017

Roy Austen (1953-2017), a former colleague, died a few days ago on March 5.

A friend recently told me that Roy had been diagnosed with cancer in January, although he had actually been unwell for months before then.

Not long after the diagnosis, Roy set up a GoFundMe page for medical expenses and for the ongoing care of his son, in preparation for the inevitable.

I really did mean to get in contact, but I got busy and Roy died before I did. At least there was still the fund…

Roy’s main line of work and his passion was photography, but that’s not how we got to know one another.

I bought my first Windows (3.1) PC from his family business, KM Computers.

Then, a while later, he offered me a job and became my boss…

By the end of 1995 I was in search of my next job after 5 years at the University of Tasmania (UTAS) in Launceston as a computer systems officer then a junior academic in the Department of Applied Computing. A lot of university contracts weren’t being renewed around that time.

Luckily for me, Roy had recently started Vision Internet, one of a small but growing number of competing Internet Service Providers (ISPs) in Tasmania. It was a small business arising from KM Computers at a time when Internet access was still dial-up: ISDN was the fastest you could hope for (128 Kbps), but most people just had a dial-up modem, giving up to around 56 Kbps, less in practice. Vision Internet started in Launceston but quickly added points of presence in other parts of the state, including an Internet Cafe.

In 1995, while still at UTAS, I had helped Roy out by writing a basic customer time accounting program in C that read utmp/wtmp login records and generated simple usage output, after someone else had difficulty doing so.
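
That program is long gone, but the gist is easy to sketch (a modern-Linux reconstruction of the idea, not the original code): wtmp is just a file of fixed-size records, so you read them one at a time and pair logins with logouts.

    /* Minimal wtmp reader: print each login event. A real accounting
       program would pair each USER_PROCESS (login) record with the later
       DEAD_PROCESS (logout) record on the same ut_line, and sum the
       per-user time differences. */
    #include <stdio.h>
    #include <time.h>
    #include <utmp.h>

    int main(void) {
        FILE *f = fopen(WTMP_FILE, "rb");   /* usually /var/log/wtmp */
        struct utmp rec;

        if (f == NULL) { perror("wtmp"); return 1; }
        while (fread(&rec, sizeof rec, 1, f) == 1) {
            if (rec.ut_type == USER_PROCESS) {
                time_t when = rec.ut_tv.tv_sec;
                printf("%-8.8s on %-12.12s at %s",
                       rec.ut_user, rec.ut_line, ctime(&when));
            }
        }
        fclose(f);
        return 0;
    }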

By 1996, Roy needed a programmer and system administrator and I needed a job. Before accepting Roy’s job offer, I was up front with him that I would probably want to do something different after about 18 months. That happened with Karen and I moving to Adelaide in 1997 where I spent 10 years with Motorola. That move was more difficult than I expected, and at least as hard as Karen knew it would be. In the end, it was a good move.

Ironically, UTAS asked me to come back for some occasional part-time tutoring soon after I started working for Roy, which may have been less economical than if they’d just renewed my contract!

Vision Internet was good while it lasted. To be honest, for the first few months, I couldn’t believe I was being paid to write C (and Perl) code, something I enjoyed doing anyway. 🙂

The compact machine room doubled as my office for the first year or so before we moved down the road to a more spacious building; not as cushy as my office at UTAS. I actually didn’t mind the machine room too much. A terminal with function key accessible “virtual consoles”, the vi editor, command-line shell, a C compiler, and a Perl interpreter kept me pretty happy. Roy was fine with me working from home occasionally as time went by too. He expected me to keep things rolling and solve problems as quickly as possible, but he was good to me and we got along pretty well.

There were only about half a dozen people working at Vision Internet, fewer early on. Everyone pitched in. Roy and I didn’t always see eye to eye though. For example, at one point we disagreed about who should have super-user privileges; for a brief time, more people had them than I would have liked. 🙂

I experienced a number of things during my time with Roy at Vision Internet and learned lessons from some:

  • Early mobile phones were fairly bulky. 🙂 Roy gave me my first mobile before I started in the job. Of course, this meant he could contact me whenever. He didn’t abuse that though. A former UTAS colleague took one look at the phone hanging off my belt and declared amusingly: “wanker phone”. 🙂 Even worse when a larger battery was added! Still, I appreciated Roy giving me my first mobile.
  • You can’t always spend time doing what you want in a job, even one you mostly like, unless you’re very lucky. I guess I already knew that from being a nurse in the 80s. I had no real interest in sysadmin tasks like applying security patches to BSD Unix kernels, maintaining backups, chasing hackers, worrying about what dodgy things people might be doing with our systems or customer sales, credit card transactions, help desk (shades of The IT Crowd: “is your modem plugged in?”). I mostly wanted to design, code, and test software. Still do. That’s largely why I told Roy I thought I’d want to move on after about 18 months. Having said that, a fair amount of my time was spent writing software in the form of a suite of customer time usage programs, each prefixed with tu, written in C and Perl. We also eventually sold tu to another local ISP.
  • The practical difference between a search-based processing algorithm over a linear data structure that runs in quadratic time and one that runs in linearithmic time: O(n^2) vs O(n log n). This matters a lot as the number of customer records (n) increases when your task is to write a program that processes customer time usage once per day, and obviously before the next day starts. To simplify: at a second of work per step, n≈300 means an O(n^2) run takes about a day while an O(n log n) run takes well under an hour (see the sketch just after this list). You can make incremental improvements to the time per step (t), but eventually you’ll hit a point where n is too large, e.g. when n=1000 and t is 0.1 seconds. Anyway, I don’t recall what our n and t were, but we hit such a limit with a tu program. When I realised what was going on and fixed it, Roy was delighted and relieved. I was pretty happy too, and thanked my computer science education, in particular the discipline of computational complexity.
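
Here’s that back-of-the-envelope calculation as a tiny C program. The numbers are purely illustrative (as I said, I no longer remember our real n and t):

    /* Rough total run times for processing n customer records at t seconds
       of work per step: O(n^2) vs O(n log n). Illustrative numbers only.
       Compile with the maths library, e.g.: cc calc.c -lm */
    #include <stdio.h>
    #include <math.h>

    int main(void) {
        double t = 1.0;    /* seconds of work per step (assumed) */
        double n = 300.0;  /* number of customer records */

        printf("O(n^2):     %5.1f hours\n", t * n * n / 3600.0);        /* ~25.0 */
        printf("O(n log n): %5.1f hours\n", t * n * log2(n) / 3600.0);  /* ~ 0.7 */
        return 0;
    }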

Before I left to go work at Motorola, I made sure Roy wasn’t going to be left without someone in my role. This gave one of my former UTAS students (Craig Madden) the opportunity he needed to break into the industry; it turned out well for Roy and Vision too.

At the height of Vision Internet, I remember occasional staff gatherings at Roy’s. He was a good host and I think he mostly enjoyed that period, despite the worry that must’ve accompanied running a business. He was generally optimistic and trusted those he employed. He had his moments, like the rest of us, when he was unhappy or angry, but mostly, he was a good guy to be around.

If I could do so, I’d tell him this:

Roy, I’m really sorry you’re gone and that I didn’t make the time to get in contact. In recent years, I should have told you how much I appreciated the opportunity you gave me a long time ago. Upon reflection, after time spent together at Vision and elsewhere, I think we would have used the word “friend” (now distorted by social media) to describe our relationship, not just “colleague”, even if we didn’t say so. I should have told you that too.

ASASSN-16ma update

November 9, 2016

As mentioned in yesterday’s updated post (with finder chart), conditions last night were less than ideal, but when the clouds cleared enough, I estimated the nova’s magnitude at 6.1.

asassn-16manov10


300 variable star observations

September 8, 2016

I recently passed 300 variable star observations, having started in 2010.

301 obs

That’s a tiny number compared with prolific Australian visual observers like Rod Stubbings, or with those doing CCD photometry (e.g. of eclipsing binary variable stars) who quickly reach the thousands, such as fellow ASSA member Robert Jenkins or AAVSO’s Mike Simonsen.

Still, I’m pleased with my few hundred plodding individual observations of 16 variable stars, namely:

  • pulsating variables: R Car, l Car, W Sgr, X Sgr, L2 Pup, eta Aql, alf Ori
  • novae: T Pyx, V1369 Cen, V339 Del, Nova Sgr 2015 No. 2
  • eclipsing binaries: zet Phe, BL Tel, V Pup, eps Aur
  • the massive, once naked eye visible, unstable star: eta Car

Most of these are visual observations, and most of those were with 7×50 binoculars:


264 visual obs

I started making DSLR photometry observations in early 2015 after taking Mark Blackford’s first AAVSO CHOICE course on the subject:

36 DSLR obs

While visual estimates are quick and convenient in a time-poor life, photometry requires some effort, from imaging through to processing and analysis. But the additional accuracy and error characterisation are satisfying, as is the ability to capture multiple bands in a single image, primarily Johnson V and B.

My last DSLR submission was in April. I’m looking forward to some nicer weather so I can get out and do more soon, in addition to the visual estimates I sneak in between the clouds.

Another Gravitational Wave detection

June 16, 2016

Another gravitational wave detection by LIGO has been announced!

See https://www.ligo.caltech.edu/news/ligo20160615

The June 15 announcement page points out that while the signal was weaker than the first detection due to the black hole masses being smaller (14 and 8 solar masses vs 36 and 29):

…when these lighter black holes merged, their signal shifted into higher frequencies bringing it into LIGO’s sensitive band earlier in the merger than we observed in the September event. This allowed us to observe more orbits than the first detection–some 27 orbits over about one second (this compares with just two tenths of a second of observation in the first detection). Combined, these two factors (smaller masses and more observed orbits) were the keys to enabling LIGO to detect a weaker signal. They also allowed us to make more precise comparisons with General Relativity. Spoiler: the signal agrees, again, perfectly with Einstein’s theory.

The news release continues:

Our next observing interval – Observing Run #2, or “O2” – will start in the Fall of 2016. With improved sensitivity, we expect to see more black hole coalescences, and possibly detect gravitational waves from other sources, like binary neutron-star mergers. We are also looking forward to the Virgo detector joining us later in the O2 run. Virgo will be enormously helpful in locating sources on the sky, collapsing that ring down to a patch, but also helping us understand the sources of gravitational waves.

Gravitational Wave astronomy does seem to have arrived!


Another early morning rainbow

February 23, 2016

About to ride to the bus station this morning; greeted by another nice rainbow:


Brought to us by the laws of physics. No pot of gold evident. Pity.

My first DSLR photometry observation

February 3, 2015

After more than 5 months of gearing up to do DSLR photometry, and two AAVSO CHOICE courses, I submitted my first DSLR photometry observation yesterday: R Car, magnitude 5.054 with an error of 0.025.

First Submission

With a visual observation, the best I could have said was that it was magnitude 5.1, with an error of a tenth of a magnitude. With DSLR photometry I can get down to a hundredth of a magnitude. A CCD could do better (a thousandth), but for ten times the cost, or more.

Yet another step in the evolution of my hobby. Thanks to Mark Blackford for sharing his considerable experience.


Total Lunar Eclipse!

October 8, 2014

Despite a fairly cloudy sky here in Adelaide, it gradually cleared to provide a nice view of tonight’s total lunar eclipse.

I took more than 150 images with my recently purchased Canon 1100D DSLR, a few of these from a tripod with a 100mm f/2.0 lens. Most, like this one, were taken with the 1100D through my LX-90 8″ SCT at prime focus.

IMG_0037

3 second exposure, ISO 1600 at 20:46 ACDT

I’ll post more when I’ve had the chance to take a closer look.

See IceInSpace for some more of my images.

BASIC’s 50th, early micros, and ACE BASIC for the Amiga

May 4, 2014

I enjoyed reminiscing about BASIC when it recently turned 50, on May 1, 2014. I learned more about the events surrounding the creation of Dartmouth BASIC from the Dartmouth web pages and especially from interview videos with co-inventors John Kemeny and Thomas Kurtz. Given my development of ACE BASIC for the Amiga in the mid-90s, the BASIC programming language has a special place in my heart. More about ACE shortly.

My first experience with BASIC, and with programming in general, was in 1977, in my second year of high school (Norwood High). Our class marked up a deck of cards (in pencil) with a BASIC program and submitted them to Angle Park Computing Centre. A week or two later I remember receiving a printout of a partial run, showing an ASCII plot of some function (a deceleration curve, I think) tantalisingly cut short by an error, the details of which I don’t recall.

At the time I thought that was an interesting experience but set it aside. As I described here, in 1978 the school bought a PDP-11 and installed it in an air-conditioned room complete with a card reader, printer, and terminals. I remember seeing the machine for the first time, gawking in wonder through the glass window in the door to the room.

11_20_console_hires pdp11-software2-r

For the first 6 months most students were only allowed to create card decks rather than using a terminal. At least the turnaround time was less than for Angle Park: you could get your program run’s print-out, correct errors in your card deck and submit it again via the card reader’s hopper.

Apart from a small amount of class-time exposure to the machine, I became a “computer monitor”, assigned on a roster to be there while others used the computer, given a modicum of responsibility for looking after the machine (e.g. card reader or printer problems, booting) but I didn’t learn too much more about the PDP-11 that way.

What really hooked me was eventually being allowed to use the terminals (pictured at right) and the interactive BASIC programming that entailed. There was plenty of competition for terminal time! One of the first interactive programs I wrote was a simple guess-the-number game in which the user was told whether a guess was less than or greater than the number the machine was “thinking” of. It seems trivial now, but that experience of interacting with an “artificial intelligence” (as it seemed to me at the time) was intoxicating, and it has stayed with me. Some fellow students started playing around with machine language on the PDP-11; that was a little beyond me at the time, but an understanding of that level would become important for me later.

In the late ’70s, Tandy had a computer store in Gawler Place, Adelaide. I used to catch a bus into town on Friday nights, pull up a chair at a TRS-80 Model 1 on display and sit there for an hour or two typing in BASIC source code for games from a book; the sales people didn’t seem to mind too much. 🙂

When I’d finished year 12 of high school, had started working as a nurse in 1981, and was earning money, I bought a CASIO FX-702P: essentially a calculator, programmable in BASIC, with interfaces for a cassette recorder (for programs and data) and a printer.

frontcvr 220px-CW-E-front

Within a year or so, I had a Sinclair ZX-81 connected to my parents’ old HMV black and white TV in the country (where I visited most weekends): a big screen indeed! This odd little machine fired my imagination via its space-age programming manual cover. Adding the 16K RAM expansion pack (shown below at rear) allowed much larger programs to be written compared to the unexpanded 1K machine.

ZX81

Programming in BASIC while listening to music like Kraftwerk’s Computer World, with simplistic, German-accented lyrics like these:

I program my home computer. Beam myself into the future.

it somehow seemed that the future was coming fast and that it was going to be overwhelmingly positive. This was a time of innocent joy when nothing was standardised (hardware or operating systems), the term micro-computer was more likely to be used than personal computer, the sterile DOS-based IBM PC “business computer” was barely beginning to emerge and the Macintosh was not yet in sight.

The pages of magazines like Australian Personal Computer and Compute! were filled with BASIC program listings for specific machines just crying out to be adapted to other BASIC dialects. Reading books such as Christopher Evans’ The Mighty Micro (1979) filled me with optimism for the future. Reading Isaac Asimov’s I, Robot and the like fired my imagination, as did TV shows like Dr Who and Blake’s 7. To be honest, all of this was also somewhat of a welcome escape from the daily realities of being a young nurse.

My next machine was a Commodore PET (CBM 4016), built like a Sherman tank. I added a 5.25″ floppy disk drive (which had cooling problems!) and a dot matrix printer via the PET’s IEEE interface. I happily spent many weekends creating games in BASIC on this computer. I also wrote a version of Eliza-the-psychotherapist that kindled an interest in artificial intelligence and language processing. Occasionally entering the PET’s machine language monitor got me thinking more about low-level concepts (processor registers etc). Reading a book called Programming the 6502 by Rodnay Zaks (1981) helped further my understanding.

That PET was followed by a VIC-20 and C-64 into the mid-80s, both of which I (of course) programmed in BASIC and a bit of hand-assembled 6502/6510 machine code POKEd into obscure areas of memory (such as the cassette buffer, not in use when a 5.25″ floppy disk drive was the secondary storage device). I started to gain some exposure to databases (SuperBase 64), word processors and other programming languages like Pascal. Interfacing with relay boards and sensors was also something I enjoyed using BASIC for with these machines, achieved by POKEing values into and PEEKing values from I/O memory locations.

vic20 C64_startup_animiert

In 1987, after a couple of years going off in various directions, I moved from Adelaide to Tasmania to work as a nurse (in ICU, Recovery etc), where I met my future wife, Karen. I didn’t have any computer with me there because I initially thought I’d only stay for a year or so, but I ended up staying for a decade. My first computer purchase in Tasmania was an Acorn Electron, little brother to the BBC Micro and programmable in a BBC dialect of BASIC. I also learned a bit of LISP (from a cassette-loaded interpreter) using the Electron.

Acorn_Electron_4x3 Commodore_Amiga_500Plus 20110501_uae4all_(30-09-2011)_(amiga_500_emu_for_pandora)

By far the most important computer purchase ever for me was a Commodore Amiga 500. I learned so much from that machine, initially programming it in AmigaBASIC and smatterings of machine code, then in C and a handful of other languages. The Amiga’s pre-emptive multi-tasking operating system and state of the art graphics and sound capabilities were fantastic. It was also to this machine that I connected my first hard disk drive. I wrote simple astronomy programs and a simple drawing program for Karen, and created an alarm and security system with infra-red sensors, keypad, strobe light etc. It even woke me up (or more likely Karen, so she could come to my aid) if I went for a sleepwalk. 🙂 I also used the Amiga and C64 to access the pre-Internet Viatel (Australia’s videotex system), bulletin boards, and CompuServe.

I took a statistics course at UniTas (Launceston) for fun in 1987 and a year or so later had started an Applied Computing degree there. I took a double major in computer science and philosophy. This ultimately led me away from a career in nursing and onto a software engineering career (after stints as a computer systems officer and a junior academic post-graduation). One of the subjects I took as an undergraduate was “Advanced Programming”, in which we wrote a compiler for a subset of Pascal into p-codes (similar to UCSD p-codes and not unlike Java VM bytecodes) rather than assembly or machine code for the native machine (Intel). One outcome is that I became increasingly interested in programming language translation and programming paradigms (especially object oriented, functional, logic and concurrent). Another outcome is that I resolved to take that knowledge and write a compiler for the Amiga for a language that I myself would want to use, not just as an academic exercise.

In October 1991, I started development of ACE BASIC for the Commodore Amiga computer. It was released to testers in March 1992 and made available for public consumption in February 1993. Like the original Dartmouth BASIC, ACE was compiled, unlike many implementations that have been interpreters. ACE was a compiler for the interpreted Microsoft AmigaBASIC that shipped with the Amiga.

This article, written for the online Amiga Addicts journal, gives some idea of the history and motivations for ACE, and here is an interview I gave in 1997 about it. Although the instruction set of the Amiga’s 68000 processor was not quite as orthogonal as the PDP-11’s, it was still really nice. ACE compiled BASIC source into peephole-optimised 68000 assembly code.

KL_Thomson_TS68000


This was assembled to machine code by Charlie Gibbs’ A68K assembler and linked against library code with the Software Distillery’s Blink linker (later I also used PhxAsm and PhxLnk). I wrote 75% of ACE’s runtime libraries in 68000 assembly, later waking up to the idea that C would have been a more productive choice. One upside is that I became quite comfortable working with assembly language. I’ve made use of that comfort in recent years when working with hardware simulator testing (ARM, PowerPC etc) and micro-controller compilers.

68000AssemblyLanguageProgramming_2ndEdition

A wonderful community of enthusiastic users built up around ACE. I wrote an integrated development environment, and a community member wrote one too (Herbert Breuer’s ACE IDE is shown below).

Another member wrote a “super-optimiser” that rewrote parts of ACE’s generated assembly code to be even faster than I managed with my simple optimisations.

aide HB

ACE was guilty of a criticism levelled by the Dartmouth BASIC co-inventors (John Kemeny and Tom Kurtz) at many BASICs since their first: being machine-specific. But then that was the intent for ACE: to make programming the Amiga more approachable to more people, combining the simple abstractions of BASIC with the unique features of the Amiga and the run-time efficiency of a compiled language like C.

kemeny_and_kurtz_250px

Given the Amiga’s demise, around 1996 I moved onto other platforms. I wrote a LISP interpreter for the Newton PDA (also doomed; I can pick ’em!) between 1998 and 2000. That was fun and had a nice small community associated with it, but it didn’t match ACE and its community.

I eventually came to possess PCs, programming them with a smattering of GW-BASIC, quite a lot of Turbo Pascal, some Microsoft Quick C, a little assembly, and Visual BASIC.

When Java appeared in 1996 I greeted it with enthusiasm and have been an advocate of it and the Java Virtual Machine, as a professional and spare-time software developer, on and off ever since. These days I’m more likely to code in Java, C/C++, Python (where once I would have used Perl) or perhaps R rather than a BASIC dialect, none of which denigrates BASIC.

The fact is that BASIC made early microcomputers accessible, such that many of us interacted with them in ways more direct than is possible with modern computers (PCs and Macs), despite all their advantages and power. Arguably, we expected less from the machines yet engaged in highly creative relationships with them. Anyone who has spent much time programming will recognise the allure. The interactive nature of these early BASIC machines only added to it.

I agree with the famous Dutch computer scientist Edsger Dijkstra when he said:

Computing Science is no more about computers than astronomy is about telescopes.

dijkstra

I also sympathise with his declaration that the BASIC statement GOTO could be considered harmful, due to the “spaghetti code” it leads to. But I don’t really agree with his assessment that:

It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

I heard the same propaganda from a university lecturer. Apparently some of us were able to be “rehabilitated”. Then again, along with his comments about BASIC, Dijkstra made some unkind comments about other programming languages, including COBOL, Fortran, and APL. For example:

The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense.

With apologies to Grace Murray Hopper, I have more sympathy with this last one. 🙂

archives_hopper

The truth is that all programming languages are crude approximations of the Platonic ideal of bringing together two minds: one artificial, one natural. There are no programming languages about which I can say: there are no improvements possible here, and there are very few languages that make communion with the machine a beautiful experience. All are to be reviled and admired for different qualities. But BASIC, in all its dialects, with all its flaws, and with the benefit of 20/20 hindsight and large gobs of nostalgia, was mostly to be admired and was directly responsible for many of us falling in love with the idea and activity of programming.

programming_languages_tower_of_babel_poster

Viva la BASIC! Happy 50th anniversary!