Archive for the ‘Uncategorized’ Category

Comet C/2021 A1 (Leonard)

January 4, 2022
Dec 26, Canon 1100D, 100mm lens, piggybacked on Meade LX-90, ISO 200, f4, 90 secs

Comet Leonard has been putting on a show for us in the southern hemisphere since mid-to-late December 2021, and I’ve taken images with my modest equipment: a Canon 1100D with a 100mm lens, using exposures of either a few seconds on a tripod or 30 seconds or more with the camera piggybacked on my 20 year old (hard to believe) 8″ Schmidt-Cassegrain Meade LX-90.

These are all from my fairly light-polluted suburban home site, with the comet sitting low in the western sky, so it’s a battle between gain (ISO), exposure time, and f-stop.
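As a rough guide to that battle, the recorded signal scales with exposure time × ISO / f-number², so the two shots in this post can be compared directly. Here’s a back-of-the-envelope sketch in C (my own rough model, not a photometric standard):

    #include <stdio.h>

    /* Rough model (an assumption, not a photometric standard): the
     * recorded signal scales with exposure_time * ISO / f_number^2,
     * the three quantities being juggled under light-polluted skies. */
    static double relative_signal(double seconds, double iso, double f_number)
    {
        return seconds * iso / (f_number * f_number);
    }

    int main(void)
    {
        /* The two shots in this post: Dec 26 (90 s, ISO 200, f4)
         * and Dec 25 (7 s, ISO 800, f2). */
        printf("Dec 26: %.0f\n", relative_signal(90, 200, 4)); /* 1125 */
        printf("Dec 25: %.0f\n", relative_signal(7, 800, 2));  /* 1400 */
        return 0;
    }

By this crude measure, the quick 7 second f2 shot collects a comparable amount of light to the 90 second piggybacked f4 exposure.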

Dec 25, Canon 1100D, 100mm lens, ISO 800, f2, 7 secs; a short satellite trail is evident at right

The following shows the comet’s movement with respect to reference stars over 6 days.

Comet’s path with respect to background stars (arrowed) from Dec 25 to Dec 30

I’ve been posting images mostly to Facebook, Instagram, and Twitter.

Michael Mattiazzo’s Southern Comets page has some amazingly detailed images of the comet (look for Leonard in the left-hand side panel), taken with much better equipment and under darker skies.

Having just passed perihelion and with Luna encroaching on the night sky darkness, the days of good views of Comet Leonard are numbered. It’s been nice to have this over the holidays though.

I find that from time to time, I need events like this to reinvigorate my enthusiasm.

RS Ophiuchi talk for The Astronomical Society of South Australia

September 4, 2021

A few days ago, on September 1 2021, I gave a short talk (15 minutes) to the Astronomical Society of South Australia about RS Ophiuchi and the 10 known recurrent novae more generally.

The talk starts at the 1 hour 7 minute 39 second mark of the YouTube video (cued up here), after the main speaker, Associate Professor Gary Hill, who gave a talk about observing astronomical phenomena with high-energy neutrinos using a neutrino observatory he helped to construct in Antarctica. He was a hard act to follow.

On the subject of RS Ophiuchi, here is an updated visual light curve for September 4 2021 with my binocular observations in purple.

One thing I talked about was the most recent and the next predicted outbursts of recurrent novae. I neglected to add U Scorpii to the list, which is predicted to go into outburst in the next year or two. Unlike T Coronae Borealis, which is likely to reach magnitude 2.5 or 3 sometime before 2026, U Sco will only reach magnitude 7.5 or 8 at maximum (starting from around magnitude 18), fading by several magnitudes within a week.

Nova Ret 2020 Update

July 18, 2020

The sky was largely cloudy at 5:45 this morning but a short-lived clear patch permitted a quick observation. The nova’s magnitude was between the 4.6 and 5.0 comparison stars in Reticulum (X25537DR chart), so around magnitude 4.8.
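The arithmetic behind such an estimate is just linear interpolation between the two comparison star magnitudes, weighted by where the nova’s brightness appears to sit between them. A trivial sketch in C (my own illustration; the real judgement is of course made by eye at the eyepiece):

    #include <stdio.h>

    /* Visual estimate by the fractional method (illustrative only):
     * fraction = 0.0 means the target matches the brighter comparison
     * star, 1.0 the fainter; in between, interpolate linearly. */
    static double estimate(double brighter, double fainter, double fraction)
    {
        return brighter + fraction * (fainter - brighter);
    }

    int main(void)
    {
        /* Judged about halfway between the 4.6 and 5.0 stars. */
        printf("%.1f\n", estimate(4.6, 5.0, 0.5)); /* prints 4.8 */
        return 0;
    }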

The AAVSO alert notice was issued a couple of days ago.

In my last post, I included a Stellarium screenshot of the field around the nova and Reticulum. Here are a couple more for better context:

An hour later, I took another quick shot of the dawn NE sky showing the Moon, Venus and cloud (0.6 sec, ISO 200, f2.0, unprocessed):

Cult of Stupidity: Naming Names

February 13, 2020

Grant Foster, as always, uses clear, simple statistical arguments to show why those who say the data doesn’t support the warming trend are mistaken.

Nova near Southern Cross

January 15, 2018

Rob Kaufman in Victoria discovered a possible nova (PNV J11261220-6531086) near the Southern Cross (Crux) in the constellation of Musca on January 14 2018. All novae start out with the designation PNV, for possible nova.

Rob’s discovery visual estimate was magnitude 7. I estimated it tonight with 7×50 binoculars at magnitude 6.7 relative to magnitude 6.4 and 7.1 comparison stars.

This context screenshot from Stellarium shows the nova’s location (cross-hairs at upper middle of image) relative to familiar stellar sign posts, including Crux and Alpha Muscae, at 10pm Adelaide time (ACDT).
PNV J11261220-6531086 wide

The next shows a narrower field of view with the nova to the right of the helpful triangular, A-shaped asterism.

PNV J11261220-6531086 narrow

Here’s a 10° finder chart from AAVSO:

X22594EO

and an 8° finder chart with the orientation closer to that of the sky around tonight’s observation time. The two comparison stars I used are circled in red.

X22594EI

After submitting my observation tonight to AAVSO, I noticed that only two observations other than mine have been submitted since Rob’s discovery observation:

  • another visual estimate by Sebastian Otero in Argentina (6.85);
  • and a tri-colour green DSLR observation (6.72) by David Blane in South Africa.

What I love about such transients is their spectacular rise in brightness and their unpredictability.

Initial spectroscopy by Rob indicates a classical nova. I’d expect to see more amateur spectroscopy of this object in the near future.

Will it become visible to the naked eye like the similarly southern and close-to-Crux V1369 Cen did in 2013 (peaking at around magnitude 3.4)? One never knows with these things, but it’s worth noting that, as per the CBAT transient report page, ASAS-SN observations suggest the nova may actually have started in the first few days of January. If so, perhaps we’re a little too far down the track to expect naked-eye visibility. All we can do is observe and see!

Being such a southerly object, it will not be as well observed as novae in the northern hemisphere, but it’s in a great location, so have a go if you can! I’ll be out observing it every clear night I can in the days to come, visually and possibly via DSLR.

ACE, optimisation and an old friend

December 29, 2017

In late November I received an email from an old friend: Sean Miller. Sean was a member of the wonderful community that built up around ACE BASIC, a compiler for the Amiga that I developed as a labour of love between 1991 and 1996. I’ve written about ACE in this blog before. Sean told me how his use of ACE influenced him over the years. It has been great to get back in contact with him.

I felt honoured and humbled when, on Christmas Eve, Sean released an episode about ACE on the Raising Awesome YouTube channel he and his son have created. In this episode (Retro Amiga Computing – ACE BASIC and Questions2.1 Development):

Sean shows how to use ACE on an Amiga emulator to compile and run a program he wrote more than 20 years ago (Questions).

Retro Computing with ACE

I’ve expressed this in email to Sean, but let me say it publicly: thank you Sean! It means more to me than I can say.

During the video, Sean comments on the progress of the compilation of Questions, notes that there were around 4000 peephole optimisations (see screenshot from video above) and wonders whether I might explain what a peephole optimisation is. I’d be happy to of course. Now you’ve got me started! 🙂

ACE generates assembly code for the beautiful Motorola 68000 microprocessor. Compilation of some ACE language constructs generates sub-optimal assembly instruction sequences on the first pass, because assembly code is emitted as ACE parses the input source code, without any knowledge of the broader context of the program.

Here’s a trivial ACE program:

x%=42
y%=x%*3

This simply stores 42 in the short integer variable x% (the type is denoted by %), multiplies x% by 3 and stores the product in the variable y%. I chose integer over floating point for this example because the assembly generated for floating point is more complex and would distract from the explanation. Speaking of distractions…

As an aside, unlike modern Intel, ARM and other processors, the 68000 didn’t have a floating point unit (FPU), so floating point operations were carried out by library code instead of hardware, such as a Motorola Fast Floating Point or IEEE 754 library. As an aside to my aside, the Amiga 500 had a 68000 processor whereas the Amiga 1200 (I owned both eventually) had a 68020. The 68020 could offload floating point instructions (which it did not know how to handle) to a co-processor. The 68040 was the first 68k processor with an on-board FPU. This is a whole topic by itself.

Back to the trivial example ACE program above…

Here’s the 68000 assembly ACE generates for the two-line program without any optimisation (i.e. without the -O option):

    move.w  #42,-(sp)
    move.w  (sp)+,-2(a4)
    move.w  -2(a4),-(sp)
    move.w  #3,-(sp)
    move.w  (sp)+,d0
    move.w  (sp)+,d1
    muls    d1,d0
    move.l  d0,-(sp)
    move.l  (sp)+,d0
    move.w  d0,-(sp)
    move.w  (sp)+,-4(a4)

With optimisation we have 6 assembly instructions instead of 11:

    move.w  #42,-2(a4)
    move.w  -2(a4),-(sp)
    move.w  #3,d0
    move.w  (sp)+,d1
    muls    d1,d0
    move.w  d0,-4(a4)

Looking at the first two lines of the 11-instruction unoptimised sequence:

    move.w  #42,-(sp)
    move.w  (sp)+,-2(a4)
Example stack operations (source: goo.gl/5EuhjG)

ACE examines this pair through a sliding window, or so-called peephole, onto the emitted instructions and notices that 42 is being pushed to the last-in, first-out (LIFO) stack then immediately popped from the stack and stored at the variable x%’s address, represented by an offset of -2 from the address stored in the register a4. The peephole optimiser reduces this push-pop pair to a single instruction:

    move.w  #42,-2(a4)
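To make the idea concrete, here’s a minimal sketch in C of that window match (my own illustration, assuming instructions held as text lines; ACE’s actual source differs):

    #include <stdio.h>

    /* Sketch of a single peephole match: a push of a value immediately
     * followed by a pop into a destination collapses to one move. */
    static int collapse_push_pop(const char *first, const char *second,
                                 char *out, size_t outlen)
    {
        char src[32], dst[32];

        /* match "move.w <src>,-(sp)" followed by "move.w (sp)+,<dst>" */
        if (sscanf(first,  " move.w %31[^,],-(sp)", src) == 1 &&
            sscanf(second, " move.w (sp)+,%31s",    dst) == 1) {
            snprintf(out, outlen, "move.w %s,%s", src, dst);
            return 1;
        }
        return 0;
    }

    int main(void)
    {
        char merged[64];

        if (collapse_push_pop("move.w #42,-(sp)", "move.w (sp)+,-2(a4)",
                              merged, sizeof merged))
            puts(merged); /* prints: move.w #42,-2(a4) */
        return 0;
    }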

ACE stops short of taking the newly optimised pair:

    move.w  #42,-2(a4)
    move.w  -2(a4),-(sp)

then peephole optimising it and emitting this:

    move.w  #42,-(sp)

The reason is that the programmer has asked for 42 to be stored in the variable x%, and that store must be preserved because x% may be read again later in the program.

A more ideal sequence would have been this:

    move.w  #42,-2(a4)
    move.w  -2(a4),d0
    muls    #3,d0
    move.w  d0,-4(a4)

which literally says:

  • move 42 into variable x%’s memory location
  • move the value stored at x%’s memory location into the 68k register d0
  • carry out a signed multiplication of 3 with the contents of register d0, storing the result in d0
  • move the contents of register d0 into variable y%’s memory location

If the constraints relating to use of x% and y% did not exist, the following would be sufficient to yield the product of 42 and 3 in 68k assembly:

    move.w  #42,d0
    muls    #3,d0

Notice that the 4 instructions after the multiplication (muls) in the unoptimised sequence are reduced, over more than one pass of the optimiser, to a single instruction that stores the product into y%, from this:

    move.l  d0,-(sp)
    move.l  (sp)+,d0
    move.w  d0,-(sp)
    move.w  (sp)+,-4(a4)

to this:

    move.w  d0,-4(a4)

So ACE does better here than with the instruction sequence before the multiplication.
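The multi-pass aspect can be sketched as running such a pass repeatedly until a whole pass makes no change (a fixpoint). Here’s a small self-contained illustration in C (again my own sketch, not ACE’s source): the first pass deletes the push/pop pair that targets the same register (a pure no-op), which exposes the remaining pair for collapsing on the second pass:

    #include <stdio.h>
    #include <string.h>

    #define MAXPROG 16

    static char prog[MAXPROG][32];
    static int nprog;

    static void emit(const char *s) { strcpy(prog[nprog++], s); }

    /* Collapse "move.X A,-(sp)" / "move.X (sp)+,B" into "move.X A,B",
     * or delete the pair entirely when A == B. Returns 1 if it rewrote. */
    static int peephole_pass(void)
    {
        char s1, s2, a[16], b[16], merged[32];
        for (int i = 0; i + 1 < nprog; i++) {
            if (sscanf(prog[i],     "move.%c %15[^,],-(sp)", &s1, a) == 2 &&
                sscanf(prog[i + 1], "move.%c (sp)+,%15s",    &s2, b) == 2 &&
                s1 == s2) {
                int drop = (strcmp(a, b) == 0);  /* push+pop to itself */
                if (!drop)
                    snprintf(merged, sizeof merged, "move.%c %s,%s", s1, a, b);
                for (int j = i + 2; j < nprog; j++)   /* close the gap */
                    strcpy(prog[j - 2 + !drop], prog[j]);
                nprog -= drop ? 2 : 1;
                if (!drop)
                    strcpy(prog[i], merged);
                return 1;
            }
        }
        return 0;
    }

    int main(void)
    {
        emit("move.l d0,-(sp)");
        emit("move.l (sp)+,d0");       /* same register: pure no-op */
        emit("move.w d0,-(sp)");
        emit("move.w (sp)+,-4(a4)");

        while (peephole_pass())        /* iterate to a fixpoint */
            ;
        for (int i = 0; i < nprog; i++)
            puts(prog[i]);             /* prints: move.w d0,-4(a4) */
        return 0;
    }

The pass returns after its first rewrite so the driver rescans from the top; a real optimiser would be smarter about rescanning, but the fixpoint idea is the same.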

There are other simple optimisations carried out when the -O option is used, relating to numeric negation, but this example illustrates the key aspects.

Bernd Brandes later wrote a more powerful optimiser for ACE, the SuperOptimiser, that built upon this simple peephole optimisation approach.

Every instruction the processor doesn’t have to execute means fewer CPU cycles, so a run-time speed-up. This matters a lot, for example, when such instructions are part of a loop that iterates many times.

To revisit ACE’s code generation and optimisation implementation, I downloaded Vidar Hokstad’s improvements to the ACE source (on GitHub) for compilation under Linux. I compiled that on my Mac OS X laptop and used it to generate 68k assembly code. Vidar contacted me several years ago to say that he was engaging in “software archaeology” (that made me feel a bit old, even then) with the ACE source code. I appreciate Vidar’s efforts. He sorted out some compilation problems under the GNU C compiler (gcc) that I would otherwise have had to fix myself.

It’s interesting to look at the Intel assembly generated by gcc for a similar C code fragment. The following would have to be embedded in a function:

int x,y;
x=42;
y=x*3;

The gcc compiler generates this sequence:

    movl    $0, -4(%rbp)
    movl    $42, -8(%rbp)
    imull   $3, -8(%rbp), %ecx
    movl    %ecx, -12(%rbp)

As with the ACE-generated 68k assembly, only the relevant part is shown. There’s additional code generated just to start up and shut down a program (by gcc, ACE or any other compiler). The Intel assembly generated here is a bit better than the optimised 68k code ACE generated (4 vs 6 lines) although, surprisingly, not very much better.

When I wrote ACE in the 90s, all components were written either in C or 68000 assembly and I went straight from an implicit parse tree to assembly code generation. These days I tend to use ANTLR or similar tools for lexical analysis (converting character streams to tokens) and parsing (checking against language grammar). I have yet to use The LLVM Compiler Infrastructure for language development, but that’s on my list too.

Creating an intermediate representation (such as an abstract syntax tree) before code generation provides additional opportunities for optimisation, something I’ve been exploring in recent times. I’ll write more about that in another post.
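As a flavour of what that buys you, here’s a toy constant-folding pass over an expression tree for the earlier two-line example (my illustration only, not ACE’s actual internals): with x%=42 visible in a symbol table and the whole expression for y% held as a tree, the multiply can be evaluated at compile time, leaving a single store of 126:

    #include <stdio.h>
    #include <string.h>

    typedef enum { CONST, VAR, MUL } Kind;

    typedef struct Node {
        Kind kind;
        int value;          /* CONST */
        const char *name;   /* VAR   */
        struct Node *l, *r; /* MUL   */
    } Node;

    /* One known binding, standing in for a symbol table: x% = 42. */
    static int lookup_const(const char *name, int *out)
    {
        if (strcmp(name, "x%") == 0) { *out = 42; return 1; }
        return 0;
    }

    /* Returns 1 and sets *out if the subtree folds to a constant. */
    static int fold(const Node *n, int *out)
    {
        int a, b;
        switch (n->kind) {
        case CONST: *out = n->value; return 1;
        case VAR:   return lookup_const(n->name, out);
        case MUL:
            if (fold(n->l, &a) && fold(n->r, &b)) { *out = a * b; return 1; }
            return 0;
        }
        return 0;
    }

    int main(void)
    {
        /* AST for the right-hand side of y% = x% * 3 */
        Node x     = { VAR,   0, "x%", NULL, NULL };
        Node three = { CONST, 3, NULL, NULL, NULL };
        Node mul   = { MUL,   0, NULL, &x, &three };
        int v;

        if (fold(&mul, &v))
            printf("emit: move.w #%d,-4(a4)\n", v); /* 126 into y% */
        return 0;
    }

The emit-as-you-parse approach can never see this, because by the time y%=x%*3 is parsed, the code for x%=42 has already been written out.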

To be honest, the more I think and write about this topic again, the more I want to.

Thanks again Sean.

Roy & I

March 11, 2017

Roy Austen (1953-2017), a former colleague, died a few days ago on March 5.

A friend recently told me that Roy had been diagnosed with cancer in January, although he had actually been unwell for months before then.

Not long after the diagnosis, Roy set up a GoFundMe page for medical expenses and for the ongoing care of his son, in preparation for the inevitable.

I really did mean to get in contact, but I got busy and Roy died before I did. At least there was still the fund…

Roy’s main line of work and his passion was photography, but that’s not how we got to know one another.

I bought my first Windows (3.1) PC from his family business, KM Computers.

Then, a while later, he offered me a job and became my boss…

By the end of 1995 I was in search of my next job after 5 years at the University of Tasmania (UTAS) in Launceston as a computer systems officer then a junior academic in the Department of Applied Computing. A lot of university contracts weren’t being renewed around that time.

Luckily for me, Roy had recently started Vision Internet, one of a small but growing number of competing Internet Service Providers (ISPs) in Tasmania. It was a small business arising from KM Computers at a time when Internet access was still dial-up: ISDN was the fastest you could hope for (128 Kbps), but most people just had a dial-up modem, giving up to around 56 Kbps, less in practice. Vision Internet started in Launceston but quickly added points of presence in other parts of the state, including an Internet Cafe.

In 1995, while still at UTAS, I had helped Roy out by writing a basic customer time accounting program in C that read utmp/wtmp logs and generated simple output, after someone else had difficulty doing so.

By 1996, Roy needed a programmer and system administrator and I needed a job. Before accepting Roy’s job offer, I was up front with him that I would probably want to do something different after about 18 months. That happened with Karen and me moving to Adelaide in 1997, where I spent 10 years with Motorola. That move was more difficult than I expected, and at least as hard as Karen knew it would be. In the end, it was a good move.

Ironically, UTAS asked me to come back for some occasional part-time tutoring soon after I started working for Roy, which may have been less economical than if they’d just renewed my contract!

Vision Internet was good while it lasted. To be honest, for the first few months, I couldn’t believe I was being paid to write C (and Perl) code, something I enjoyed doing anyway. 🙂

The compact machine room doubled as my office for the first year or so before we moved down the road to a more spacious building; not as cushy as my office at UTAS. I actually didn’t mind the machine room too much. A terminal with function key accessible “virtual consoles”, the vi editor, command-line shell, a C compiler, and a Perl interpreter kept me pretty happy. Roy was fine with me working from home occasionally as time went by too. He expected me to keep things rolling and solve problems as quickly as possible, but he was good to me and we got along pretty well.

There were only about half a dozen people working at Vision Internet, fewer early on. Everyone pitched in. Roy and I didn’t always see eye to eye though. For example, at one point we disagreed about who should have super-user privileges; for a brief time, more people had them than I would have liked. 🙂

I experienced a number of things during my time with Roy at Vision Internet and learned lessons from some:

  • Early mobile phones were fairly bulky. 🙂 Roy gave me my first mobile before I started in the job. Of course, this meant he could contact me whenever. He didn’t abuse that though. A former UTAS colleague took one look at the phone hanging off my belt and declared amusingly: “wanker phone”. 🙂 Even worse when a larger battery was added! Still, I appreciated Roy giving me my first mobile.
  • You can’t always spend time doing what you want in a job, even one you mostly like, unless you’re very lucky. I guess I already knew that from being a nurse in the 80s. I had no real interest in sysadmin tasks like applying security patches to BSD Unix kernels, maintaining backups, chasing hackers, or worrying about what dodgy things people might be doing with our systems; nor in customer sales, credit card transactions, or help desk duty (shades of The IT Crowd: “is your modem plugged in?”). I mostly wanted to design, code, and test software. Still do. That’s largely why I told Roy I thought I’d want to move on after about 18 months. Having said that, a fair amount of my time was spent writing software in the form of a suite of customer time usage programs, each prefixed with tu, written in C and Perl. We also eventually sold tu to another local ISP.
  • The practical difference between code that uses a search-based processing algorithm over a linear data structure running in quadratic vs linearithmic time – O(n^2) vs O(n log n). This matters a lot as the number of customer records (n) increases when your task is to write a program that processes customer time usage once per day, and obviously before the next day starts. To simplify: given a processing time of a second per basic operation, n≈300 can mean the difference between a run that takes a day (300^2 seconds is about 25 hours) and one that takes well under an hour (300 × log2(300) seconds is about 41 minutes). You can make incremental changes to the processing time per customer (t), but eventually you’ll hit a point where n is too large, e.g. when n=1000 and t is 0.1 seconds. Anyway, I don’t recall what our n and t were, but we hit such a limit with a tu program. When I realised what was going on and fixed it (see the sketch after this list), Roy was delighted and relieved. I was pretty happy too and thanked my computer science education, in particular, the discipline of computational complexity.
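To make the difference concrete, here’s a hypothetical reconstruction in C (not the original tu code): looking each customer up with a linear scan is O(n) per lookup and O(n^2) over a nightly run, while sorting once and using binary search drops each lookup to O(log n), O(n log n) overall:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    typedef struct {
        char name[32];
        long seconds_used;
    } Customer;

    static int cmp_name(const void *a, const void *b)
    {
        return strcmp(((const Customer *)a)->name,
                      ((const Customer *)b)->name);
    }

    /* O(n) per call: n calls per run gives the quadratic blow-up. */
    static Customer *find_linear(Customer *db, size_t n, const char *name)
    {
        for (size_t i = 0; i < n; i++)
            if (strcmp(db[i].name, name) == 0)
                return &db[i];
        return NULL;
    }

    /* O(log n) per call, after a one-off O(n log n) sort of db. */
    static Customer *find_sorted(Customer *db, size_t n, const char *name)
    {
        Customer key;
        memset(&key, 0, sizeof key);
        strncpy(key.name, name, sizeof key.name - 1);
        return bsearch(&key, db, n, sizeof *db, cmp_name);
    }

    int main(void)
    {
        Customer db[] = { {"abel", 0}, {"baker", 0}, {"clarke", 0} };
        size_t n = sizeof db / sizeof db[0];

        qsort(db, n, sizeof *db, cmp_name);   /* pay O(n log n) once */

        Customer *c = find_sorted(db, n, "baker");
        printf("%s\n", c ? c->name : "not found");
        (void)find_linear;                    /* shown for contrast */
        return 0;
    }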

Before I left to go work at Motorola, I made sure Roy wasn’t going to be left without someone in my role. This gave one of my former UTAS students (Craig Madden) the opportunity he needed to break into the industry; it turned out well for Roy and Vision too.

At the height of Vision Internet, I remember occasional staff gatherings at Roy’s. He was a good host and I think he mostly enjoyed that period, despite the worry that must’ve accompanied running a business. He was generally optimistic and trusted those he employed. He had his moments, like the rest of us, when he was unhappy or angry, but mostly, he was a good guy to be around.

If I could do so, I’d tell him this:

Roy, I’m really sorry you’re gone and that I didn’t make the time to get in contact. In recent years, I should have told you how much I appreciated the opportunity you gave me a long time ago. Upon reflection, after time spent together at Vision and elsewhere, I think we would have used the word “friend” (now distorted by social media) to describe our relationship, not just “colleague”, even if we didn’t say so. I should have told you that too.

ASASSN-16ma update

November 9, 2016

As mentioned in yesterday’s updated post (with finder chart), conditions last night were less than ideal, but when the clouds cleared enough, I estimated the nova’s magnitude at 6.1.


300 variable star observations

September 8, 2016

I recently passed 300 variable star observations, having started in 2010.

301 obs

That’s a tiny number compared with prolific Australian visual observers like Rod Stubbings or those doing CCD photometry (e.g. of eclipsing binary variable stars) who quickly reach the thousands, such as fellow ASSA member Robert Jenkins or AAVSO’s Mike Simonsen.

Still, I’m pleased with my few hundred plodding individual observations of 16 variable stars, namely:

  • pulsating variables: R Car, l Car, W Sgr, X Sgr, L2 Pup, eta Aql, alf Ori
  • novae: T Pyx, V1369 Cen, V339 Del, Nova Sgr 2015 No. 2
  • eclipsing binaries: zet Phe, BL Tel, V Pup, eps Aur
  • the massive, once naked eye visible, unstable star: eta Car

Most of these are visual observations, and most of those were with 7×50 binoculars:


264 visual obs

I started making DSLR photometry observations in early 2015 after taking Mark Blackford’s first AAVSO CHOICE course on the subject:

36 DSLR obs

While visual estimates are quick and convenient in a time-poor life, photometry requires some effort, from imaging through to processing and analysis. But the additional accuracy and error characterisation are satisfying, as is the ability to capture multiple bands in a single image, Johnson V and B primarily.

My last DSLR submission was in April. I’m looking forward to some nicer weather so I can get out and do more soon, in addition to the visual estimates I sneak in between the clouds.

Another Gravitational Wave detection

June 16, 2016

Another gravitational wave detection by LIGO has been announced!

See https://www.ligo.caltech.edu/news/ligo20160615

The June 15 announcement page points out that while the signal was weaker than the first detection due to the black hole masses being smaller (14 and 8 solar masses vs 36 and 29):

…when these lighter black holes merged, their signal shifted into higher frequencies bringing it into LIGO’s sensitive band earlier in the merger than we observed in the September event. This allowed us to observe more orbits than the first detection–some 27 orbits over about one second (this compares with just two tenths of a second of observation in the first detection). Combined, these two factors (smaller masses and more observed orbits) were the keys to enabling LIGO to detect a weaker signal. They also allowed us to make more precise comparisons with General Relativity. Spoiler: the signal agrees, again, perfectly with Einstein’s theory.

The news release continues:

Our next observing interval – Observing Run #2, or “O2” – will start in the Fall of 2016. With improved sensitivity, we expect to see more black hole coalescences, and possibly detect gravitational waves from other sources, like binary neutron-star mergers. We are also looking forward to the Virgo detector joining us later in the O2 run. Virgo will be enormously helpful in locating sources on the sky, collapsing that ring down to a patch, but also helping us understand the sources of gravitational waves.

Gravitational Wave astronomy does seem to have arrived!