Enums for hard-coded configuration

While writing my new Stretch Timer app, I needed to set lower and upper limits for three key numbers. I had them hardcoded as numeric literals but when I wanted to change one of them, it occurred to me there must be a better way.

After doing a bit of research on enums, I found a construct that not only factors the numbers out to a single place, but also makes their use a lot more readable.

enum pickerRange {
    static let Stretch = (min: 5, max: 60)
    static let Rest = (min: 5, max: 60)
    static let Repeat = (min: 3, max: 15)
}

It is fairly plain from the enum declaration what is being described, and it is fairly evident what’s going on when the values are referenced.

return pickerRange.Stretch.max - pickerRange.Stretch.min + 1
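For completeness, here’s the whole pattern as a self-contained sketch. A caseless enum can’t be instantiated, which is what makes it a safe namespace for constants. (Identifier names here follow Swift naming conventions rather than matching the post exactly, and `repeats` stands in for `Repeat`.)

```swift
// A caseless enum cannot be instantiated, so it serves purely
// as a namespace for a group of related constants.
enum PickerRange {
    static let stretch = (min: 5, max: 60)
    static let rest = (min: 5, max: 60)
    static let repeats = (min: 3, max: 15)
}

// Number of rows a picker needs to cover the stretch range
let stretchRows = PickerRange.stretch.max - PickerRange.stretch.min + 1
// stretchRows == 56
```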


When we observe the world around us, we frequently compare what we see with our previous experiences. “That’s a lot of rain” is not an absolute measure of the volume of rain without considering the observer – do they come from Thailand or southern California? So it should come as no surprise that the technology we are already familiar with has a significant bearing on our assessment of new gadgets. It recently occurred to me that my own judgement has been affected in this way, and I think it’s a common affliction.

My revelation came from my recent phone upgrade. In late 2014 I sold off my beloved iPhone 5 and “went big” with the iPhone 6 Plus. I remember the first week or two of using the giant new phone. I had pain in my hands and wrists until I got used to dealing with the size of the screen. Then, for three years, it was simply “my phone” and gave me no real problems except when it came to pocketability.

When I decided to upgrade this year, I opted for the iPhone 8 over the 8 Plus. The dual camera system of the Plus is not a real drawcard for me (I have a real camera) so that slight issue of pocketability led me to the “regular” sized phone this time around. It didn’t hurt that it was $150 cheaper, either. I ran my 6 Plus in “zoomed” mode for the last few weeks before the upgrade to get used to the amount of ‘stuff’ that would fit on the screen and now I’m perfectly happy with my new iPhone 8. It seems normal to me in every way now.

The revelation comes from observing the now reasonably high number of Plus-sized iPhones I see in public. They now look huge to me! My initial reaction many times has been disbelief that people are carrying these enormous devices around that are so much bigger than any iPhone; bigger than the device I have only just stopped carrying myself. But careful observation confirms that these are indeed iPhones Plus.

The effect was clarified for me when I observed my son’s girlfriend using her iPhone 5S. It just looked like a “smaller” iPhone. Yet I remember well when I was using my 6 Plus, that troubleshooting issues on my wife’s 5C felt comical; I felt like I was working with an iPod Nano-sized device. It seems it does not take me long to get used to whatever sized phone I use every day and consider all other devices relative to it.

At the time of launch of the 6 Plus, there were a lot of people online claiming the phone was simply too big for “ordinary people.” I still know people who claim this. And yet, from my observations out in public, the Plus-sized phones are predominant among those who I have to assume are “normal people.” The phones haven’t shrunk and I don’t think pockets have become bigger. Certainly, hands haven’t. I believe what has changed is more people have experience of the larger iPhones now that they’ve been out for several years.

There have been other aspects of iPhones for which I have seen this effect in my online reading. Touch ID was going to be pathetic, or at least an unknown quantity, before it was released because all existing fingerprint readers were rubbish. Face ID seems to have followed the same pattern, although some smart commentators had spotted the Touch ID parallel.

Then there’s the much vaunted iPhone camera versus a real camera. It still astounds me how often I read that iPhones have amazing cameras. They don’t. Honestly, they produce very average images compared to most dedicated cameras. Your photos might look great on the phone, but put them up full screen on any retina Mac and they will show their flaws readily. Photos from my $2500-ish camera setup are night and day better than anything an iPhone can produce. Even my “hundred dollar camera” can produce shots roughly equivalent at wide angle, and they are markedly superior when zoomed due to its optical zoom capability. There is no denying the convenience of having a moderately capable camera on your phone – because it’s always in your pocket or purse – but they’re only good cameras for a phone.

I could add examples of software and even data plans but you get my point. Some people believe what they have is fine and anything more or different is unnecessary or overkill. Sometimes it is. If what you have does the job, then good for you. But progress is made by branching out.

I use a Mac because I wondered whether it was better than Windows. I started with a desktop (iMac) then upgraded to a laptop because I thought that would be better. Now I’ve gone back to an iMac and I know why it’s the better choice for me. I’ve even added a 12″ MacBook to my repertoire because I know what roles a laptop can fill for me.

I use Adobe Photoshop Lightroom because I wondered if it could do more than just collect my photos in folders. I use Affinity Photo because I wondered if it could be as good as Photoshop. I used Photoshop because I wondered if it was more powerful than PaintShop Pro.

I have had the luxury of funds to make all these changes and seek out different solutions over the years. Not everyone is so lucky. Which is why trial software is so important and, if you can, get your hands on different types of hardware to actually try them for some real tasks. Most importantly, think objectively about technology.

Banner image by leg0fenris.

Using regular expressions in QShell

It’s pretty rare that I publish anything about my work, but given the difficulty I had in figuring out this particular problem with online research (in the end, I only found the answer by experimenting) I thought it would be useful to others if I published the solution.

One of the first problems I always encounter when searching industry standard technologies as they apply to the IBM i platform is that the name of the platform is incredibly hard to include in search terms, so I’ll helpfully mention here that this also applies if your search terms are AS/400, AS400, iSeries, System i, or i5/OS. Heck, I’ll even throw in a gratuitous eServer reference!

With that out of the way, the problem to be solved: using QShell to apply Unix text manipulation commands to a stream file, making use of the power of regular expressions. In my case I had downloaded a CSV file from a public site and needed to take care of some formatting prior to using the CPYFRMIMPF command to load it into a database file. I’d used this method before, but hit an additional snag this time around.

There were three problems to solve with the file, all of which I knew I could attack with sed. If you’re not familiar with sed, here’s a brief introduction to how I use it.

sed -e 's/find this/replace with this/'

You can probably figure out from the example what it will do. It’s a simple search and replace (the initial ‘s’ means ‘substitute’). As with most Unix commands, in this default form it will take standard input, filter it, and write the result to standard output. Later on I’ll hook those up to the files I need.
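To make that concrete, here’s the default behaviour on a made-up line of input – sed reads standard input, applies the substitution, and writes the result to standard output:

```shell
# Only the first occurrence on each line is replaced
# unless the 'g' flag is added after the final slash.
echo 'please find this in the text' | sed -e 's/find this/replace with this/'
# → please replace with this in the text
```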

The next thing to know for this example is how regular expressions work. That’s way too deep a subject for me to cover here. If you need to learn this, I recommend having a read of parts 17 and 18 of Bart Busschots’ Taming the Terminal series. I’ll be using some basic features plus back references to capture groups. They will be deployed within the sed command.

So to the first problem. The first two fields in each record were actually numeric but provided as quoted values, so CPYFRMIMPF would treat them as strings and complain about the target fields being numeric. Also, there was a requirement to concatenate these two values (ironically, as if they were strings) as a new value at the end of each record. The goal, then, was to strip the quotes and append the concatenated value to the end of the record. Back references to the rescue.

sed -e 's/^"([0-9]{1,3})","([0-9]{1,4})"(.*)$/\1,\2\3,\1\2/'

That looks pretty complex but if you break it down, it’s simply looking for two quoted groups of digits. (I specified the lengths, but an open-ended match like [0-9]* would also have worked – sed’s basic regular expressions don’t support lazy matching.)

But here’s the key reason I’m writing this article – as written above, the command will not work in QShell. It’s relatively common to need to escape capture parentheses thus: \(  … \) and this is indeed required in QShell. But even then it won’t work. The trick that took me a good hour of experimenting to discover is that QShell also requires the cardinality braces { } to be escaped! I’ve not seen this on any other platform. Here’s what actually works.

sed -e 's/^"\([0-9]\{1,3\}\)","\([0-9]\{1,4\}\)"\(.*\)$/\1,\2\3,\1\2/'

It looks a complete mess with all the back-slashes required to escape both parentheses and braces. At least the square brackets do not need the same treatment!

With my new snag out of the way (I’d not previously used cardinality), let’s move on to the next problem in the file – the dates are presented as “2017-07-15” and we need them as an 8-digit number. So we need to strip the quotes and the hyphens.

sed -e 's/"\([0-9]\{4\}\)-\([0-9]\{2\}\)-\([0-9]\{2\}\)"/\1\2\3/g'

This is a pretty simple example, but adds the ‘g’ modifier on the end to enable it to replace all matching dates in the line, not just the first.

Finally, there were a lot of empty fields in the file and because every field was quoted and some of these needed to be treated as numeric, I decided to remove all empty quotes. Which is to say, find any adjacent pair of double-quote characters and remove them. Again, this applies as many times as it can on the line.

sed -e 's/,""/,/g'

That was a really simple one. Once I had these working on the QShell command line, it was time to wrap them up for execution in my CL program. I added the input file on the front with a cat command, piped it into the multi-action sed command, and redirected that output to my result file. Note the doubling up of the single quotes in the sed commands to allow for the overall quoting of the CMD( ) parameter string.

STRQSH CMD('cat /tmp/inputfile.csv | +
sed -e ''s/^"\([0-9]\{1,3\}\)","\([0-9]\{1,4\}\)"\(.*\)$/\1,\2\3,\1\2/'' +
-e ''s/"\([0-9]\{4\}\)-\([0-9]\{2\}\)-\([0-9]\{2\}\)"/\1\2\3/g'' +
-e ''s/,""/,/g'' > /tmp/outputfile.csv')

And there you have it! Simple!
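As a sanity check, the same three substitutions can be run outside QShell – GNU and BSD sed accept the identical escaped BRE syntax – against a hypothetical record shaped like the file described above:

```shell
# A made-up record: two quoted numeric fields, a quoted date, and an empty field
line='"12","3456","2017-07-15","","abc"'

# Apply all three substitutions in one invocation, as in the final CL command
echo "$line" | sed \
  -e 's/^"\([0-9]\{1,3\}\)","\([0-9]\{1,4\}\)"\(.*\)$/\1,\2\3,\1\2/' \
  -e 's/"\([0-9]\{4\}\)-\([0-9]\{2\}\)-\([0-9]\{2\}\)"/\1\2\3/g' \
  -e 's/,""/,/g'
# → 12,3456,20170715,,"abc",123456
```

The quotes and hyphens are stripped, the empty field collapses, and the concatenated value lands on the end of the record.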

Cleaning up macOS Launchpad

It is a much-maligned feature of macOS, but there are times when Launchpad is useful. Such as when I want to scan my apps for recent additions that warrant a review. Sure, I can just open the Applications folder, but Launchpad is a much easier presentation.

But, it does have its issues: for one, the inability to delete non-Mac App Store apps. This is a quick post on a method I found that allows you to do just that.

When searching for a solution, there were numerous older posts online that talked about issuing database deletes but many noted this didn’t work in more recent releases of the OS and indeed I found this to be true of Sierra.

Then I found this post at OS X Daily which offered what appeared at first to be a simple rearrangement of icons rather than removal of specific ones. However, it turns out it does exactly what I needed. If you delete an app from your Mac and then follow this tip, the icon will be removed from Launchpad.

So, here’s the sequence of steps required.

  1. Delete the app. If you have AppZapper or AppDelete, use one of those. If you have Hazel, simply drag the app to the trash and let Hazel clean up the extras*.
  2. Empty the trash. The one time I didn’t do this, weird things happened.
  3. Execute the below command in Terminal. If your browser wraps the line, just copy it as one and paste it into Terminal and it will be fine.
defaults write com.apple.dock ResetLaunchPad -bool true; killall Dock

That’s it. You may notice when you next open Launchpad that some of the icons are absent momentarily, but they will appear before you have time to worry.

A side effect is that the first page will have the default layout of Apple apps and subsequent pages will have all the extra apps previously present (less those deleted) in alphabetical order.

The one time I didn’t empty the trash, most of the icons were drawn as dotted rectangles, hence my step 2.

*Hazel has a feature called App Sweep. I’m not sure if this is turned on by default. You can find the setting in the Trash tab of the Hazel preference pane.

Combine and conquer

A little over a week ago I wrote about my quest for software to ‘run’ my podcast production for The Sitting Duck Podcast. Specifically, some form of sound board software that would work well for multi-track recording into Logic Pro X (hereafter referred to simply as Logic).

Since then I’ve discovered two new pieces of software and a new way to approach the multi-track solution.

As much as Apple’s Mac App Store is derided by the tech press, I have found some very useful software in it over the years and I’m still in love with the simplicity of installation and re-installation at the click of a button and maybe entry of a password. The down sides are the poor search results and the general unavailability of a ‘try before you buy’ option. (Some vendors make their apps free and unlock full functionality with an in-app purchase.)

So last night I set about searching for a ‘sound board’ application in the App Store. I was surprised that only a dozen results appeared. I guess this is an indication of the smaller number of apps in the Mac App Store as compared to the iOS one.

Unsurprisingly, Ambrosia Software’s Soundboard appeared in the top spot. Also unsurprisingly, many of the others are clearly not what I’m after – Burp And Fart Piano, anyone? But there are two in the list that warrant further investigation.

SoundBoard FX

This is a fairly straightforward sound board implementation. There are a useful number of tweaks that can be made to each sound, and the run-time interface is nice and clean. It also has a pop-out clock and countdown (on the current clip) timer. I was impressed at first glance.

However, it does not have the ability to route the audio differently for each clip.

MiX16 PRO

This app looks a lot more ‘professional,’ as its name suggests. Like QLab, it supports more than just audio, allowing for video and images as well. These won’t be useful for my situation but are, I think, an indication of the level of work that has gone into the app.

Sticking with the audio capabilities, there are a decent number of tweaks that can be made to each clip, including routing each to a specific sound device. I emailed the developer asking whether it would be possible to specify the channels on the target device and he quickly responded saying no, not yet.

When I saw his reply, something clicked in my brain and I suddenly thought of a possible solution to the lack of channel mapping. Only Sound Byte and QLab had this ability (the latter at a significant cost!) so it was a bit of a deal-breaker if I had to have it. This morning I played with my idea and I can now say MiX16 PRO is my front runner.

Aggregate devices

I had wondered whether there was a way of playing songs from the command line that would give me enough control. I soon found the afplay command and also found that there did not seem to be an easy way to set the output device for this. However, I kept coming across all manner of posts on blogs and forums that mentioned aggregate devices.

I know what these are because I have played with them before. macOS’s built in app, Audio MIDI Setup, has the ability to aggregate multiple physical sound devices into a single virtual one. Premier audio app developer Rogue Amoeba have built on this capability with their excellent Loopback app which, you may recall, I listed in my toolbox.

The thought that struck me was this: Could I create a whole bunch of simple, two channel sound devices, to which I could direct the tracks – one to each – and then aggregate those into a mammoth single device which Logic could use as its source?

The short answer – yes!


I opened Loopback and began to set up the devices I thought I would need. I created eight simple ‘pass-through’ devices, which use all the default settings for a new device: simply click the ⊕ button to add a device, give it a name, and you’re done. Eight was enough for my needs, but I could use many more. I called mine “Tracker01” and so on to keep the names short, “Tracker” referring to each device’s role in accepting a track to feed the multi-track device.

Next I created a device I called “Master” and to this I added all of my “Tracker” devices plus my microphone. By default, this would mix all of the incoming stereo signals into a single stereo output. The magic is in setting up manual channel mappings.

You can click on both of the images below to get a better look if you need to.

The “Master” device now has all of the audio needed for Logic in a single device (Logic requires a single device). I should note here that the Loopback interface for mapping the channels is a little finicky. The trick is to select the target channel in the bottom section by clicking its number, then select the source device in the top section by clicking its name, and then you can drag a channel token from top to bottom. If you don’t follow these steps the channel tokens either refuse to be dragged or refuse to land on their target. Weird.


Having created my monster, I needed to test it. For this I chose to use Sound Byte. It has full device and channel mapping and it works in trial mode for a short time – enough to get my test done.

I edited the sample tracks I already had in Sound Byte for my previous testing by setting each of the four tracks to output to the “Tracker01” through “Tracker04” devices and the channels in each case to 1 and 2 (the defaults). The setting of the channels this way mimics what the non-channel-mapping software will do. I then opened up my Logic test project and modified it to use “Master” as its input device. The four song tracks kept their channel mappings (3&4, 5&6, 7&8, 9&10), which was perfect, plus I added in a microphone track using channels 1&2.

With all tracks set to record and to monitor, I fired up two songs simultaneously (not very pleasant) and the Mac’s speakers provided some low-level input to the microphone, which I had turned on.


You can see in the image above that the meters in Logic show a signal coming from the relevant tracks (in green) in Sound Byte, plus the low-level signal into the microphone.

So now I have a solution that enables me to use any software that can at least route clips to a different device, even if it cannot route to specific channels.

I’m going to think on the whole situation a little while longer, but with this in place and the high quality, reasonable price, and responsive developer for MiX16 PRO, I think that will be the way I go.

Above and beyond

UPDATE: I’ve now added a follow-up post covering two new apps and a solution to channel routing.

There was a time when I used to record and publish a podcast every Thursday. It only really lasted a few years before I lost the passion, but it does still exist and I release new episodes from time to time when the bug bites me.

I’ve been skirting around the edges of putting a new episode out for the last couple of weeks but one thing is holding me back – software. The podcast was born on a Windows PC in the wonderfully simple CastBlaster software. After switching to a Mac, I used GarageBand for a time before discovering what would almost be the perfect tool for me – Ubercaster.

Unfortunately, Ubercaster is no more, so this week I set out to see if I could come close to the Ubercaster experience with modern Mac software.


Before beginning my search, I stopped to consider what features made Ubercaster the uber solution for The Sitting Duck Podcast. You can get a pretty decent feel for all of the features from this contemporary review over at MyMac.com.

The first key feature was the single workspace. I would spend a bit of time in the Prepare layer, adding the songs I would be playing and perhaps some sound clips, and tweaking titles and layouts so I would have a good visual flow during recording. I could also adjust volumes for consistency and trim out silent starts and ends on clips.

Then came the Record layer. I would generally record an entire episode in one take, effortlessly clicking on songs to ‘fire’ them, clicking on the microphone mute so I could take a drink, and keeping an eye on time-to-run on the current song plus the total recording length, and checking levels. Importantly, for those odd times when things didn’t quite go right, I knew I always had the Edit layer so I would just push through any hiccups.

The killer feature of Ubercaster was the Edit layer. Even though the recording was live, the result was a multi-track layout almost exactly like GarageBand or Logic Pro. I could edit out vocal flubs if I was so inclined, though I rarely did. If a song hadn’t fired when I wanted, or there had been a few seconds of silence at the start that I hadn’t caught in preparation, I could simply slide things around to make it seem like everything had gone perfectly. Sometimes I even started the wrong song, stopped it, and started the correct one. That was still easy to fix in post even when vocal and music overlapped.

Since the disappearance of Ubercaster, I have gone back to using GarageBand or, more recently, Logic Pro X. Using these means either ‘wiring up’ multiple applications like SoundBoard and SoundFlower to the input tracks of the recording software and then juggling applications during recording, or non-live production, where I record a vocal part, then stop and insert the song, then record vocal again. The former is a lot more work and easy to get confused with, and the latter doesn’t have the same feel to it, although it can save rather handsomely on the total time to record!

So my goals for a new approach would be:

  • Preparing all audio before hitting record, so it will just need to be ‘fired’ at the right time.
  • Minimising the number of applications I will need to interact with during recording.
  • Recording multi-track*

* When it comes to multi-track recording, there are different levels of ‘multi’ to consider. It’s relatively easy to get vocals on a different track to music, but separating music from jingles or multiple music tracks from each other is ideal. In the latter case, it allows for a cross-fade that can still be edited after recording as with Ubercaster.

The Toolkit

While I would be looking for new software, there are a few applications I already have which could be pressed into service as part of the solution.

  • Logic Pro X, from Apple, is the obvious end point for recording because it has all the multi-track tools I could imagine and then some.
  • Audio Hijack, from Rogue Amoeba, is another option for recording; it can technically record multiple tracks but is not an editor. It can play a role in managing audio channels, however.
  • Loopback, from Rogue Amoeba, is a lower level tool than Audio Hijack which can also assist in routing audio channels.
  • SoundBoard, from Ambrosia Software, is outwardly an obvious choice for firing songs and jingles.
  • I have various others such as SoundSource, Fission, and Piezo, but I see these as unlikely contributors.

The Contenders

I’ve searched a number of times for an all-in-one solution approaching what Ubercaster had to offer and come up well short, so I have been looking mostly at applications to ‘run the show’ for recording in Logic Pro. I’ve owned a copy of SoundBoard for a while and had tried using this for recording, so it became the benchmark from which I would cast a net for alternatives. I visited alternativeto.net and typed in “SoundBoard” to see what it had to offer.

Note that SoundBoard is still a contender, but it lacks the audio routing to be the perfect solution and the interface is not to my liking. Not to mention I have concerns about Ambrosia Software. The SoundBoard Remote application for iOS is on the list of apps that will stop working in iOS 11 because it’s still 32-bit, and the pain of WireTap Studio still stings.

qwertyGO

This USD$39 application from SIR Audio Tools is clearly aimed at live performance. It appears to have all the controls needed, including channel routing, but the interface is clearly designed around firing sounds by keyboard. While it does support MIDI (see below), the need for a clear interface at record time is not met because each sound clip is represented by a picture of a single keyboard key with limited information visible and no real options for layout. I have not trialled this.

Sound Byte

This USD$39* application from Black Cat Systems is an interesting one. It has all the controls needed, including routing and layout (though it would be nice to be able to reduce the grid from 75 cells). If I were to judge it purely from the web site, I might run away (1990 called and wants its design back) but ticking all the boxes put this one on my trial list.

In use, the preparation of clips is cumbersome but relatively easy to understand. It would benefit considerably by allowing the user to set some aspects of multiple clips at once. With some time on setup, the record-time interface (which is the same interface) is functional and has some nice customisations available such as flashing clips nearing their end, disabling played clips, and locking volumes.

* The $39 price point gets the Lite version which would be sufficient for me. More expensive versions increase the number of racks available from the single rack in Lite.


This €39.50 application bills itself as “professional audio software” but I think I might need a qualification before I can use it. It has most of the controls necessary, but seems to be missing channel routing. At least, I think it’s missing. I couldn’t find anywhere to set a per-clip routing and when I did find something to do with output channels, I couldn’t figure out what it does. The interface is also confusing and I found it difficult to actually perform the basic record-time actions. Not to mention adding clips is a nightmare – drag and drop is not supported! I went looking for the help and it just says “coming soon.” Given they’re up to version 4, I’m not hopeful for that.

Soundplant

This USD$50 application from Marcel Blum takes the same approach as qwertyGO, using a facsimile of the computer keyboard as its interface. It has all of the control features, but the interface paradigm kept it off my trial list.

BZ Soundboard

This free application looks like a very simple one. A little too simple, however, as it lacks the routing features. Also, it was last updated in 2011. This did not make it to my trial list.


Although intriguing – this is an HTML5 sound cueing application – it will undoubtedly lack the required controls (being confined to a browser) and, well, any application whose homepage is a GitHub page isn’t going to be an easy time. Not on my trial list!

QLab

This application from Figure 53 seems to be the bee’s knees! It can cue audio, video, and lights and has plenty of features and a fantastic interface. It took me a little while to get to grips with the parts of the application, but this really is full featured and well thought out. There’s even a free version!

Just one problem. Audio routing is not available in the free version. That requires a license. A USD$399 license!

Outside The Box

After not finding the perfect solution from the above list, I began to wonder if there was a way to use what I already had to bolt something together.

The music I play lives mostly in iTunes. I’ve ‘Hijack‘ed iTunes into Logic before as a means of firing the songs, but it’s finicky, requiring a double click to launch the song from my playlist, followed by quickly clearing the Up Next list. Forget that second step and you get the next song when you don’t want it.

I looked at scripting iTunes to get better control. This is possible, but will take some effort to figure out, and it’s simply not possible to route iTunes audio, other than ‘Hijack‘ing it.

I looked at Audio Hijack to see if it had the ability to play files as sources, but it doesn’t.

MIDI Controllers

Quite literally outside the (computer) box, adding some form of MIDI controller could make a difference to which software could work. For example, qwertyGO and Soundplant both lost out because of their interfaces, but if I could subjugate that interface with an external control surface, they may work well. I’m not sure how much a control surface might cost and I’m not sure whether dedicated controls will work versus a clearly labelled interface.

Best So Far

I’m still on this journey and as yet I’ve not outlaid any money, but so far the best option looks like Sound Byte. The interface is good enough and it has the channel routing that, together with Loopback, will allow me to lay down multiple songs into multiple tracks along with vocals and jingles each in theirs.

My next best options are to use SoundBoard or QLab (free) and deal with the fact that all the songs will lay down on the same track. An acceptable trade-off if I can commit to getting the cross-fades just right first time! 

BeatsX Bluetooth earbuds – update

Two weeks ago I published a review of the BeatsX Bluetooth earbuds over at podfeet.com as part of my hosting duties standing in for Allison Sheridan. I had a few issues with them at the time I wrote the review but now I have a few additional thoughts on them, so here’s a quick update.

The issue I had with the power button still effectively remains, but with practice it’s really simple to judge “more than a second” to hold it down, and I’ve found that in addition to easily forming the habit of turning them off and on, I’ve pretty much stopped looking at the power light now. The chime in my ears is enough to confirm to me I’ve done the job of turning them on or off – if I leave at least one bud in an ear before pushing the button, so I can hear it. This last bit can be mentally challenging. When I get into my car, I engage my after-market Bluetooth system. Once connected, it ‘steals’ the audio from the BeatsX and my first instinct is to simply remove them from my ears – but I need to leave them in and power them off first, even while the audio is now playing on the car speakers.

The bowing out or in of the cables below my ears remains a problem. Every length of wire on this product is a ribbon shape and this causes them to have a mind of their own. From time to time I still manage to twist an earbud – almost always the left – the wrong way and the wire twists uncomfortably into my face. If there were one thing I would change in the design, it would be round wires from the earbuds to the ‘lumps’. On the positive side, the recent weather has given rise to wearing my ‘up-to-the-chin,’ fleece-lined jacket and I’ve had no problems zipping this up to the top against the rain with the buds staying put, although see the following point for a probable contributing factor to that.

I mentioned I had trouble with wind catching the flat wires and tending to blow the buds out of my ears. There were a couple of days when the winds were severe enough (around 50km/h+) that this became extremely annoying. One lunch time I went for a walk and just about gave up trying to listen because the left bud kept edging out of my ear. I’d been experimenting with the different sized tips, even going with different sizes on each ear, but nothing would stay in. After that walk, I fetched out the optional “wings” that are provided and fitted the smaller of the two pairs. These have made all the difference. With the wings fitted, the buds don’t move at all. I’m fairly sure I still don’t have a perfect seal in my left ear, but even the strongest winds do not budge them. These also make the buds feel lighter in my ears.

There have been two days when I have absent-mindedly left my BeatsX at home. For such occasions I have two pairs of Sennheiser wired earbuds stashed in my car, as it’s when I get out at the railway station that I usually realise my error. I have the pair I had been using full time prior to the BeatsX, and an earlier pair which are a little higher quality but less practical. Using either pair of Sennheisers made a definite difference to my listening experience. The sound of both sets was somehow more comfortable. I’m no sound engineer, so I can’t explain why that is. I also found the apparent weight in my ears was less with the wired sets, which surprised me. But… I struggled frequently with the wires, which was the whole point of going Bluetooth.

Just yesterday I discovered an unexpected benefit of the wireless earbud lifestyle. I had spent the last hour or so at work listening to some tunes while I got some work done – my ‘desk neighbours’ having left early. The iPhone was on my desk so I could easily see what track was playing and skip or ‘like’ it (I was listening to Apple Music’s My New Music Mix playlist). When it came time to pack up, I stood up and went about putting stuff away and even fetching my jacket from the nearby coat stand, all without having to fuss at all with where my phone was (it remained on the desk) or where any wires were dangling as I was reaching over and under my desk. When I was ready to leave, I simply placed the phone in my jacket inside pocket and walked out. The music had continued uninterrupted the entire time.

To summarise, the effect of wind on the flat wires is a bit of a design flaw that had me seriously thinking about giving them up, but the wings address that problem, if not truly fix it. With that out of the way, no other niggles make me regret my purchase. 

You can purchase BeatsX directly from Apple for NZD$229.95.

All images copyright © Apple, Inc.

Making pi

You know what the internet is like. You click on a link on Twitter which takes you to YouTube which suggests another video, which then suggests another and before you know it you’ve spent far more time than you intended at the computer.

That’s the process that landed me on a series of Numberphile videos. I know about Numberphile through another, much longer series of links. I listened to the NosillaCast, which featured Bart Busschots, who co-hosted the International Mac Podcast (no longer active), on which I appeared with Andrew J Clark who recommended The Prompt (since replaced by Connected) which featured Myke Hurley, who started Relay.FM on which he joined CGP Grey for Cortex, where Grey mentioned Hello Internet which features Brady Haran who makes the Numberphile videos!

Cutting to the chase, I watched a whole bunch of Numberphile videos today on all manner of topics including a number which has long held a fascination for me – pi, or π.

Many years ago, when I was in my early twenties, I was boasting to my father that I had memorised a bunch of digits of pi. I forget how many, but I suspect it was something like 15 or so. He promptly grabbed a piece of paper and slowly wrote out 30 decimal places of pi. The first ones matched mine so I had to assume he was correct with the rest. When I quizzed him on how he did it, he wrote out a poem in which the number of letters in each word corresponded with the decimal digits of pi. While trying (and failing) to find said poem online when writing this post, I discovered this technique is referred to as piphilology and specifically, my father relied on a piem.

In an effort to one-up my father, I took the 30 digits he had furnished and set about learning them by rote. I created for myself an extra login step on the computer terminal at work which required me to enter all 30 digits to continue, and I used this several times a day – though I had a much shorter cut-out passphrase for use when the boss was waiting on me!

To this day, I can still recall those digits, plus another 10 I committed to memory many years later. I swear to you that this is typed entirely from memory.


In fact, my Dad’s poem had a confusing word (I recall it had an apostrophe or similar) and for a long time I remembered the palindromic sub-sequence …46364… only later discovering it was correctly …46264…

After watching the pi-related videos today, I had a mind to get myself to a round 50 digits by committing the next 10 digits to memory. Now, while I could simply have looked up these digits online, I began to wonder, as I have before, whether I could use my Mac to calculate the digits.

A related video (not from Numberphile) that I had watched had a link to some software claiming to do exactly that, but on inspection it appeared not to have been updated in a while and was only provided for Windows and Linux. While I could probably have got the Linux one to work (perhaps in a VM), I began a search for a Mac program that could do it.

It turns out, there’s a remarkably simple way to calculate digits of pi on any Mac or Linux system without any software beyond what comes as standard. There’s a unix command bc which calculates with “arbitrary precision.” Give it the right equation and it’ll work with extraordinary numbers of digits.

This site I came across gives a remarkably short script to generate digits of pi to an output file. I’m not sure what that a( ) function is (it is intrinsically hard to search for!), but I ran it for 300 digits and it finished instantly. Then I ran it for 10,000 digits and it finished in 100 seconds. Before I went to bed I left it calculating 100,000 digits. It took – no kidding – 11 hours and 1 second.

While the big one was still running I had a file with 10,000 digits of pi – what to do with it? I’d recently been fiddling with shapes in Affinity Designer, trying to come up with some kind of new wallpaper for my 27″ iMac. I would create a new wallpaper, which I could then use to help me learn those next 10 digits, and maybe more.

So what, exactly, did I have? I had a text file which contained 10,000 digits of pi arranged in lines of 68 digits, terminated by a backslash and newline. I figured I needed to combine pairs of lines to get the right sort of shape for fitting a lot of digits on a 16:10 screen. I turned to the Atom text editor and its regular expression search and replace.

Find   : (\d{68})\\\n(\d{68})\\
Replace: $1$2

Now I had half as many lines of 136 characters and no superfluous backslashes. I copied and pasted the lot into a text block in Affinity Designer and chose a suitable font – monospaced, of course – which was Menlo. With a suitable font size to allow the digits to be read, but not enormous, I then trimmed to 42 lines to fit the screen with some space top and bottom. That’s 5,710 decimal places (plus the “3.”).

For a bit of style, I added a black background and ran a subtle grey ‘shimmer’ gradient from corner to corner. I think it looks pretty snazzy.

5,710 decimal places of pi on my desktop.

But then I decided I wanted something a bit more… funky. One of the Numberphile videos included a number of clever and artsy representations of pi using various visual techniques including the use of colour. What if I coloured all of the 0s one colour, all of the 1s another colour, etc?

I set about choosing the colours. It quickly dawned on me that picking evenly spaced colours along the hue axis of the hue-saturation-luminance picker would be a good choice. I trialled one of each digit and it looked OK. But how to do roughly 571 of each without going completely batty?

I hit upon a relatively simple technique using a combination of Pages and Atom. In Pages I created a new document with a single line of text “0123456789” and I coloured each of the digits appropriately. I then saved the file as rich text.

Opening the rich text file in Atom, it was reasonably easy to see how each colour was applied to each character. At the top there was a definition of all of the colours and then for each character there was a sequence like the following: \cf2 \strokec2 0

The colours were numbered from 2 through 11 in the order I had defined them, so all I needed to do was replace each “0” with “\cf2 \strokec2 0”, then each “1” with “\cf3 \strokec3 1” and so on. It struck me that doing a find and replace on each of the digits was going to be problematic, considering replacing the 0s would introduce 2s (as part of the colour definition), so I first did a bunch of search and replaces to switch out 0 through 9 with A through J. Then I was able to replace “A” with “\cf2 \strokec2 0” and so on.
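The two-round idea can also be sketched as a shell pipeline: tr swaps every digit for a letter in one pass, so the digits introduced by the colour codes can never be re-matched (the file names here are my own assumptions):

```shell
# Round 1: digits -> letters A-J in a single pass.
# Round 2: each letter -> its RTF colour code plus the original digit.
tr '0123456789' 'ABCDEFGHIJ' < pidigits.txt |
sed -e 's/A/\\cf2 \\strokec2 0/g' \
    -e 's/B/\\cf3 \\strokec3 1/g' \
    -e 's/C/\\cf4 \\strokec4 2/g' \
    -e 's/D/\\cf5 \\strokec5 3/g' \
    -e 's/E/\\cf6 \\strokec6 4/g' \
    -e 's/F/\\cf7 \\strokec7 5/g' \
    -e 's/G/\\cf8 \\strokec8 6/g' \
    -e 's/H/\\cf9 \\strokec9 7/g' \
    -e 's/I/\\cf10 \\strokec10 8/g' \
    -e 's/J/\\cf11 \\strokec11 9/g' > coloured.txt
```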

Having done the two rounds of replacements, I had a huge wodge of text which I then simply copied and pasted into the rich text file in the appropriate place. A quick preview showed it had worked! You might notice an extra opening “0” which is there because the first digit in the original file was prefixed with a bunch of other codes, so I left it there in case their order mattered. I later edited it out.



That looked pretty ugly on a white background, but when I copied it into the Affinity Designer file and set it in Menlo against black, it looked… bright! I reduced the opacity to 50% but it still didn’t look right. Time to add a shimmer! I used the transparency tool to create a transparency gradient that varied between 100% and 75%. It was looking better, but the random distribution of the digits still gave an overall flat appearance. What was needed was some kind of hero feature.

I quickly hit upon the idea of using the π symbol itself as a feature. Many fonts’ glyphs for π are rather dull and square but I eventually settled on the AppleMyunjo font which has a pleasingly stylish one. I added a giant π in 50% grey, set the blend mode to colour dodge so it would brighten up the colours below it, lowered the opacity until it seemed about right (75%), then finally added a moderate gaussian blur to soften the edges.

Ta da!

So there you have it. 5,710 decimal places of pi, as art. I’m really pleased with the final version. You can click on the image above to see it larger, or click below to download the full 2880 x 1620 pixel version I use on my iMac.

pimono.png  picolour.png

The best camera – update

Back in 2013 I wrote a blog post (since taken offline) about my disagreement that modern smartphone cameras “make compact cameras obsolete.” My premise was that for many types of photo – just about anything of an object out of reach – the lack of optical zoom is a severely limiting factor.

Later I purchased what I call “the hundred dollar camera” and have been carrying this in the bag I take to work every day, and sometimes – when I remember – in my pocket. My goal is to find and capture scenes that are simply impossible to capture on a phone, using a device that’s just as pocketable and super cheap.

On Friday morning, I was doing my usual walk down Wellington’s waterfront on a frankly gorgeous morning. The harbour was glassy and still – a state it doesn’t often achieve – and there were numerous people out enjoying it in vessels of different sizes.

SSV Robert C. Seamans is a 134-foot steel sailing brigantine operated by the Sea Education Association (SEA) for oceanographic research and sail training. She had been berthed at Wellington’s Queens Wharf the previous day, but on this particular morning she was underway. (As it turned out, merely to another berth around the corner.) The sight of this beautiful tall ship on the glassy water with a grey overcast above was stirring enough that I decided I needed to capture the scene. I reached for my hundred dollar camera.

SSV Robert C. Seamans

This photo was taken at an equivalent focal length of 106mm – almost twice what’s possible with the latest technology in the iPhone 7 Plus. As shown above, it is a very slight crop, colour corrected, and with some noise removal applied, which really only seemed to affect the foliage on the hill (Mount Victoria) behind.

Viewed at full scale, the quality of the image is terrible, but it looks fantastic on my iPhone 6 Plus screen. Easily the equal of good photos taken on the phone itself. But of course, if taken with the iPhone, it would have to have been a major crop and the quality issues on iPhone photos would become apparent – certainly if taken with the 28mm equivalent standard lens.

So, you might get something approaching that quality with an iPhone 7 Plus. But you wouldn’t have a chance of getting this shot at 172mm equivalent.

SSV Robert C. Seamans

That’s a hair over three times the focal length of the iPhone 7 Plus and still comfortably inside the optical zoom range of the hundred dollar camera. The same types of processing have been applied as above and once again, it looks fantastic on my iPhone screen. In fact, it looks pretty darned good on my computer screen, too, if not at full zoom.

An iPhone shot would show a boat in a harbour. This shot shows people on a boat. This is a perfect example of my characterisation of “objects you can’t touch” which the iPhone camera is simply incapable of capturing well.

I’m not giving up my DSLR any time soon, even though I concede it is a bulky item to carry. I have carried my DSLR on my commute on a number of occasions, but it’s a little too heavy and bulky to be a regular practice. Or is it? As I wrote that sentence, it occurred to me the biggest pain with carrying the DSLR is the size of it in my laptop bag which is not designed to carry it. With some thought, I may be able to solve that.

But aside from issues of bulk with a DSLR, this tiny camera, which I can carry in the same pocket as my iPhone 6 Plus at the same time, clearly outperforms any model of iPhone for less money than you’ll spend upping the storage size on your next iPhone.

The “hundred dollar camera” is a Canon IXUS 160, which cost me NZD$110 in 2016.

Not realising potential

This is a follow-up to my previous post and came about due to a discussion I had on that post with a friend.

One of the basic issues I identified with the cut-and-paste situation was that the touch interface is having to deal with an “old school” model of text editing that came, in fact, from the days before the mouse. However, I came to realise there are things that a touch interface should be really good at but which are still hamstrung by old ideas.

I’ll keep this post much shorter. Watch this video, and pay particular attention to the section on photos starting at the 02:34 mark. Scrub forward to that section if you wish. If you’ve seen it before I urge you to watch that section again.

OK, now open up the Photos app on your iPad – the one which has seen continuous improvements for the last 10 years. Which experience is it closer to – the original iPhoto app for Mac, or the demo in the video above?

It is clearly an iteration of the basic iPhoto design which debuted in 2002 and couldn’t even claim to be original back then. You get a grid of photos, some sorting options, some searching, and you can tap any photo to have it enlarge to full screen. The iOS app doesn’t even match the basic features of Photos for Mac today. Try adding a keyword to a photo. You can’t.

Why don’t we have something much closer to the demo by now? In case you didn’t know or notice, the demo took place in 2006. The year before the iPhone launched. Four years before the iPad launched. Modern iPads can play fantastically complex and detailed real-time video games – why can’t I organise and edit my photos in a natural fashion?