Back in 2013, I wrote a blog post (since taken offline) about my disagreement that modern smartphone cameras “make compact cameras obsolete.” My premise was that for many types of photo – just about anything featuring an object out of reach – the lack of optical zoom is a severely limiting factor.
Later I purchased what I call “the hundred dollar camera” and have been carrying it in the bag I take to work every day, and sometimes – when I remember – in my pocket. My goal is to find and capture scenes that are simply impossible to capture on a phone, using a device that’s just as pocketable and super cheap.
On Friday morning, I was doing my usual walk down Wellington’s waterfront on a frankly gorgeous morning. The harbour was glassy and still – a state it doesn’t often achieve – and there were numerous people out enjoying it in vessels of different sizes.
SSV Robert C. Seamans is a 134-foot steel sailing brigantine operated by the Sea Education Association (SEA) for oceanographic research and sail training. She had been berthed at Wellington’s Queens Wharf the previous day, but on this particular morning, she was underway. (As it turned out, merely to another berth around the corner.) The sight of this beautiful tall ship on the glassy water with a grey overcast above was stirring enough that I decided I needed to capture the scene. I reached for my hundred dollar camera.
This photo was taken at an equivalent focal length of 106mm – almost twice what’s possible with the latest technology in the iPhone 7 Plus. As shown above, it is a very slight crop, colour corrected, and with some noise removal applied, which really only seemed to affect the foliage on the hill (Mount Victoria) behind.
Viewed at full scale, the quality of the image is terrible, but it looks fantastic on my iPhone 6 Plus screen – easily the equal of good photos taken on the phone itself. But of course, if taken with the iPhone, it would have to have been a major crop and the quality issues of iPhone photos would become apparent – certainly if taken with the 28mm equivalent standard lens.
So, you might get something approaching that quality with an iPhone 7 Plus. But you wouldn’t have a chance of getting this shot at 172mm equivalent.
That’s a hair over three times the focal length of the iPhone 7 Plus and still comfortably inside the optical zoom range of the hundred dollar camera. The same types of processing have been applied as above and once again, it looks fantastic on my iPhone screen. In fact, it looks pretty darned good on my computer screen, too, if not at full zoom.
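For anyone wanting to check the focal-length arithmetic above, the ratios work out as follows. This is a quick sketch assuming the commonly quoted 56mm-equivalent figure for the iPhone 7 Plus telephoto lens:

```python
# Compare 35mm-equivalent focal lengths to express a "zoom advantage".
IPHONE_7_PLUS_TELE_MM = 56  # commonly quoted equivalent for the telephoto lens

def zoom_advantage(shot_mm, reference_mm=IPHONE_7_PLUS_TELE_MM):
    """How many times longer the shot's equivalent focal length is
    than the reference lens."""
    return shot_mm / reference_mm

print(f"{zoom_advantage(106):.2f}x")  # → 1.89x: "almost twice"
print(f"{zoom_advantage(172):.2f}x")  # → 3.07x: "a hair over three times"
```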
An iPhone shot would show a boat in a harbour. This shot shows people on a boat. This is a perfect example of my characterisation of “objects you can’t touch” which the iPhone camera is simply incapable of capturing well.
I’m not giving up my DSLR any time soon, even though I concede it is a bulky item to carry. I have carried my DSLR on my commute on a number of occasions, but it’s a little too heavy and bulky to be a regular practice. Or is it? As I wrote that sentence, it occurred to me the biggest pain with carrying the DSLR is the size of it in my laptop bag which is not designed to carry it. With some thought, I may be able to solve that.
But aside from issues of bulk with a DSLR, this tiny camera, which I can carry in the same pocket as my iPhone 6 Plus at the same time, clearly outperforms any model of iPhone for less money than you’ll spend upping the storage size on your next iPhone.
The “hundred dollar camera” is a Canon IXUS 160, which cost me NZD$110 in 2016.
This is a follow-up to my previous post and came about due to a discussion I had on that post with a friend.
One of the basic issues I identified with the cut-and-paste situation was that the touch interface is having to deal with an “old school” model of text editing that came, in fact, from the days before the mouse. However, I came to realise there are things that a touch interface should be really good at but are still hamstrung with old ideas.
I’ll keep this post much shorter. Watch this video, and pay particular attention to the section on photos starting at the 02:34 mark. Scrub forward to that section if you wish. If you’ve seen it before I urge you to watch that section again.
OK, now open up the Photos app on your iPad – the one which has seen continuous improvements for the last 10 years. Which experience is it closer to – the original iPhoto app for Mac, or the demo in the video above?
It is clearly an iteration of the basic iPhoto design which debuted in 2002 and couldn’t even claim to be original back then. You get a grid of photos, some sorting options, some searching, and you can tap any photo to enlarge it to full screen. The iOS app doesn’t even match the basic feature set of Photos for Mac today. Try adding a keyword to a photo. You can’t.
Why don’t we have something much closer to the demo by now? In case you didn’t know or notice, the demo took place in 2006. The year before the iPhone launched. Four years before the iPad launched. Modern iPads can play fantastically complex and detailed real-time video games – why can’t I organise and edit my photos in a natural fashion?
UPDATE: Please see the bottom of this post for an interesting side story.
There has been a recent resurgence of discussion in the Apple commentators’ world about the future of the Mac. In many cases, the discussion turns to how well, or not, iOS can take the place of macOS for many types of work.
I love my Mac and would hate to see it fade away. I’ve always had this feeling that some basic tasks are just more intuitive and simple on a Mac than on iOS but until a few days ago I couldn’t come up with any concrete examples.
I recently purchased a 9.7″ iPad Pro and have been using it for some writing – one of those tasks the commentators say an iPad is pretty darned good at. I wrote a fairly lengthy blog post for a friend’s blog using Ulysses, both on the iPad and my MacBook and iMac. Most of the initial writing was done on the iPad but editing occurred on the Macs. Again, it seemed like the easier option to edit on a Mac. But what was the truth of it? Here’s how I quantified the issue…
Having completed the blog post, I decided I should dig out a “to do” list I had created of future topics for my friend’s blog. It was a fairly old list and I found it sitting in Apple’s Notes app as a checklist. I decided it would be better to copy the items into OmniFocus so I could prioritise them, add notes, and mark off those completed.
OmniFocus on an iPad is a joy to use. It’s the type of app that really lends itself to a touch interface and I find it easier to use there than on my Macs unless I’m doing some major reorganisation. So I decided I would copy the 31 entries across on my iPad. Should be fun!
Here’s what it takes to copy list entries from a Note to an OmniFocus task. Hold on tight…
Multitasking makes everything easier. OmniFocus has been brought onto screen and has a new project ready for the new tasks.
Eagle-eyed readers may note the circles have disappeared from the Notes list in subsequent screen captures. That’s because I removed them after I discovered they would be included as “- [ ]” characters in the copied text. A quirk, but not relevant to my point here.
I have tapped the button to add a task to the project. Now what?
A tap on the Note places the cursor (not visible in the screen capture) in Notes. With care, the cursor lands at the end of the text I want to use to create the task (after the “1” in “Item 1”).
I tap again to invoke the pop-up menu, from where I can enter text selection mode. I tap Select.
Initially, only the nearest word is selected, so I need to carefully tap-drag to select the whole line.
Now I have my line selected, I can tap Copy.
I have my line of text copied, but I’m still in Notes. I need to go over to OmniFocus with a tap over there.
The first tap sets OmniFocus as the active app, but I can’t paste anything yet.
Another tap brings up the Paste option, which I can then tap.
Boom! I have my text in the new task and I can save it. OmniFocus has the nifty “Save +” button which saves time by immediately opening a new task for entry.
At this point, it has taken 8 taps to copy a line of text from one application to the other. Several of those need to be made with some precision or additional taps will be required. This does not include those taps required to create and save the task in OmniFocus.
I repeated the steps to copy the second task across. But wait! Is there a better way? Does iOS offer better mechanisms to solve this simple task?
Well, there are action extensions, and OmniFocus most certainly has one. Let’s add the third item in the “modern” way.
First up, the same tap, tap, tap, drag is required to select the line of text in Notes.
But instead of going straight to the Copy function, we need to tap the arrow to get to action extensions.
Now we can tap the Share… button. We’re up to 6 taps now.
I had to scroll to find OmniFocus, but I could rearrange those to make it instantly accessible. So just counting the tap on the OmniFocus extension, we’re up to 7 taps.
The OmniFocus extension pops up, but because I’m “coming through a different door” the context of my project is lost and therefore I need to select it. Again, some organisation could put the project at the top, so we’ll give that as a freebie, but the tap to select the project takes us to 8 taps total.
A final, ninth tap on Save creates the task in OmniFocus. The modern approach takes more taps than the old school.
I use action extensions reasonably often and find them mostly intuitive and simple and quick. But when it comes to a repetitive task like this, all of those attributes melt away. Even the intuitiveness! When copying 31 tasks for my real list, my brain would start to get muddled on which step needed to be performed next. This is true on the Mac as well, sometimes, but this is a very, very simple task – copy some text from one application to another.
This same task on the Mac is far simpler. Again, with both apps open side by side and the project container and new task created, it takes the following steps:
1. Click on Notes.
2. Drag over the text.
3. Cmd-C.
4. Click on OmniFocus.
5. Cmd-V.
While that’s still 5 operations, only one of those requires any dexterity – the drag. More of it can be accomplished with keyboard shortcuts, too. Cmd-Tab to switch between the apps and, outside of my counting scope, Cmd-N to create a new task in OmniFocus and Enter to save it. I could even select the text in Notes with the keyboard although I reckon that’s slower and more fiddly.
Granted, I could add a keyboard to my iPad, but should I need an expensive add-on just to do a simple copy and paste task? I have no idea how different it would be with a keyboard, but I suspect there’d still be a lot of touching the screen. The use of a keyboard on the Mac, plus the basic keyboard shortcuts (Cmd-N, C, V), is intrinsic to almost all apps because the keyboard is always present.
The nub of this issue, as I see it, is that a touch interface will never be good at detailed work that follows the same paradigms as the traditional desktop computer. Perhaps there is a clever way to multi-touch edit text that has yet to be thought of, but it’s not here now.
A final note. Those eagle-eyed readers may also note the time in my screen captures is out of order. There were so many individual taps that I found it hard to remember to take every screen capture the first time through. And the second time.
In this post, I used a specific task between Apple’s Notes and OmniFocus to illustrate a fairly basic concept. This was never intended to be a slight on either product, but rather the nature of iOS.
However, even though I did not reach out to the Omni Group, nor even complain about their OmniFocus product, the CEO of Omni Group, Ken Case, obviously came across the post and reached out to me to explain that I could have done this particular task more easily. This is a fantastic level of support! And so I thought it deserved a callout here.
@zkarj Sorry it’s not more discoverable, but you can paste multiple lines of text into OmniFocus for iOS to create multiple tasks at once.
Postscript: After publishing this post I noted that where I had used image captions to describe the steps, the text was too small in comparison to the few passages in regular paragraphs. In the space of a couple of minutes, I edited the post to move all of the caption text into text blocks, including creating most of those text blocks. With one hand on the keyboard, one on the mouse, and my eyes planted firmly on the screen, this was a quick and fluid task. Ignoring the fact this task isn’t even possible on iOS, if it were, I don’t think I would have been done in two minutes!
Today marks 10 years since I switched to the Mac and I thought, like that day, it deserved a blog post to mark the occasion.
A screen capture from the “Wayback Machine” at archive.org shows my blog post as it originally appeared in 2007 on the Sitting Duck blog.
I still think it is one of the best decisions I ever made to (mostly) abandon Windows. For all the complaints I have had over the years about the Mac, I still get a regular view into the Windows world and, as my friend Allison says, it’s like being prodded in the ribs every 5 minutes.
It’s a fascinating discussion to have with people who remain with Windows and say “it’s fine.” Very few people defending Windows have ever spent much time really immersing themselves in the Mac operating system, yet a large proportion (these days a majority I reckon) of Mac users came from years of Windows use like me, or at least have been exposed to Windows in an office environment. In my experience, Windows defenders never actually defend their choice of OS but rather attack my choice, often by explaining how “nasty” Apple is in its ways.
I had a fairly level-headed discussion recently where my ‘opponent’ was actually trying to defend Windows, but I kept pointing out to him that all of his positive points amounted to “it’s not as bad as it used to be.” I think that’s the nub of the issue. People just expect things to be difficult. That’s not to say things are always easy on the Mac, far from it. Yet I use Windows 7 five days a week at work and it is constantly bugging me in so many ways.
I’m not going to make this a long diatribe and try to convince anyone to switch. Truth be told, most who will read this will already be Mac users. No, I just wanted to mark the occasion and note that I’m still happy with the decision – 8 OS versions and 3 Macs later.
UPDATE: The export slices in the original file were a little bit off on dimensions and I wasn’t sure how to fix this at the time. I’ve now adjusted the Artboards and export slices to perfect those sizes, and this may solve the softness issue I described in the original post as well. The updated file can be found here.
I’m a recent convert to Affinity Photo, so when a special offer came up for Affinity Designer, I leapt at the chance. I’ve not done much more than mucking around to learn Designer, but while doing so I figured it’d be a great tool for designing the type of strong, bold icon I need for my (planned) iOS apps.
The template makes use of a new feature in Affinity Designer 1.5 – Symbols – along with artboards. The large (1024 pixel square) design on the left is based on a symbol which is then duplicated to the other artboards which are sized to provide the required icon variants.
Layers palette with master artboard expanded.
To modify with your own design, expand the master artboard and edit only the contents of the symbol, as marked with the solid orange border on the left. In order for the automatic scaling to work, you must remember to check the Scale with object checkboxes wherever present, such as on strokes and effects. Also, keep an eye on where your objects go – if they don’t get an orange border, they’re not going to be replicated to the other sizes.
A number of guide layers are provided to help you edit and visualise your design.
Check the GUIDES layer to display the golden ratio guides on top of your design. You can also use these for snapping as required. This layer is not part of the symbol, so it will not affect the other icon sizes; however, remember to turn it off before exporting if you want your large icon to be clean.
Check the MASK layer to round off the corners with a white background to help visualise the final result. Once again, this will only affect the large icon, but there is another way to round the icons on all sizes. See below.
If the white corners are too much, there is a hairline outline of the corners in the Outline layer which may prove useful while designing.
The top level of the symbol (titled EDIT THE CONTENTS OF THIS) has an optional mask (titled Unselect for Xcode icons) which will round the corners on all of the icons. This may be useful for generating pleasing “finished” icons for promotional material, however, ensure you turn this off when exporting your icons for Xcode.
When you’re done designing, head on over to the Export Persona, where each artboard has already been set up for exporting in the relevant resolutions as required by Xcode or iTunes. The sizes and scale variations are current as at iOS 10.
I’m providing this free of charge because I think it might be useful for others, however, no warranty, express or implied, yadda yadda. You can contact me on my contact page if you have any feedback, but I don’t promise I’ll respond.
NOTE: I have noticed that the scaling Affinity Designer performs can lead to some soft edges and this includes the outside edges of the gradient background in the example – you may wish to check the output and consider whether you want to take a more direct approach to the scaling depending on your design.
In a recent episode of the Accidental Tech Podcast, John Siracusa said something that really resonated with me. In reference to the unfolding drama of Smile Software moving to a subscription model for their Text Expander product, someone had mentioned maybe their intent was “to move towards ‘enterprise.'” John’s response was along the lines of “I hope not, because moving towards enterprise means moving away from quality.”
I’ve been living with this fact for the last while at work and it can be soul destroying.
Events this week illustrated this phenomenon well. Earlier in the week, a colleague and I had decided we would install a new piece of software on a Windows 2012 server to see if it would be useful for us. The software in question was one component of a suite from none other than enterprise software powerhouse IBM.
The first hurdle came with downloading the software. After pecking away at arcane web pages which relied on buggy scripts that didn’t always do what was expected, we were finally greeted with the download section we needed. We knew from past experience the downloads would be slow, so we carefully selected “Windows 2012 server” and “English” as filters to minimise the size. The total was 6.5GB, comprising one 5GB and one 1.5GB file. I started the larger one downloading just before 10:00 am. It completed downloading at about 3:20 pm! Fortunately, I guessed – correctly – that the 1.5GB file could be downloaded in parallel without affecting the speed of the larger one. This is from one of the world’s largest IT companies, and I work for a telecommunications company. I’ve downloaded other stuff at work at very decent speeds, so I have to assume IBM is throttling downloads to this ridiculous rate.
With our software downloaded, we set about installing it. After working through a few issues with our server to meet the prerequisites, I finally got the installer to run. Early on in the twenty-six-step install wizard, it ran a series of checks. It reported that we had not turned on the “8dot3” feature of Windows. Except we knew we had, previously, for another product. I eventually had a hunch we might have turned it on for only one of the server’s two disk drives, and maybe the installer was checking both, even though we had no intention of using the one on which it was turned off. I turned it on anyway, passed the check, and then moved on to the next step – which was to choose on which drive to install the software. If only those two steps were reversed and aware of each other!
After working through the twenty-six steps, the actual install began and took about half an hour. The tool we want to use looks at a database and analyses the data within it, trying to ascertain the relationships within and the quality of the data. Because this is one component of a suite, and the suite is quite powerful, we had to install a complete web application server as part of the process. That application server has numerous configurations, user profiles, and more – hence the twenty-six-step wizard. I think I entered details for half a dozen user profiles for various uses during the wizard’s execution. We had basically installed a full-blown “IT stack” in order to use one tool. That’s one of the key failings of enterprise software – it is generally considered “OK” to install a large mass of “stuff” to perform a simple task because “that’s what enterprise software does.”
The next day I set about trying to launch the tool we had wanted to try. Having told it to install only the one component we needed, I was bewildered by the array of a dozen shortcut icons it had placed on the desktop, none of which carried the name of the tool we sought! I found something called “launchpad” and that brought up a web page which included a link to our tool. Success!
Well, no. The tool appeared with a login screen. Which login was I supposed to use? I gave it my Windows credentials (which some of the other IBM software uses) but it complained. I referred back to my screen grabs of all those users I had defined in the wizard and guessed which one it might be wanting. On my second guess, I was rewarded with a successful login and… a blank screen with a few menu options at the top.
A poke around in the menus soon found “add” type functions. “Add a data source,” I said. That got me an empty screen with an option to “Add a connection.” Great, let’s do that. Err, how do I connect to a database when the only connection types available are clearly web-type connections? Time to head back to the documentation!
As has often been the case with IBM software, it is the documentation that makes me want to punch someone. After much hunting around through the table of contents, I located the section on adding connections. If you want to connect to a database, it said, you need to set it up in a different component of the suite… Gah!!
Back to the installer. At least this time it was fairly intelligent about the process. It guessed I wanted to add components and didn’t ask me many questions before installing the new component I selected. With the additional install complete… there were no additional icons on the desktop. Hmmmm. I fired up the launchpad. Nothing new there, either. Once again, I started trying many of the icons previously provided until I eventually found access to the new component. OK… time to set up that database connection!
I dived right in and fairly intuitively figured out where I could add what I needed. I created a new “thing,” specified it was a database connection and the specific type (IBM’s own DB2) and entered the obvious values. But there were a whole lot more values to be entered than I knew what to do with. Time for the documentation again.
When I found the documentation for “adding database connections” I threw up my hands, swore, and walked away from my desk. I have this cartoon pinned to the partition above my desk. I looked over at it.
I have reproduced the entire documentation for this complex action below.
1. Specify the required connection details in the connector stage editor (required details vary with the type of connector).
2. Click the Save button. The Data Connection dialog box appears with connection details already filled in.
3. Fill in the remaining details such as name and description, and folder to store the object in.
4. Click OK to close the Data Connection dialog box, and continue with your job design.
Steps 2 and 4 are “Click button.” Steps 1 and 3 say, essentially, “enter something.” The entire page for this complex task — which warranted its own table of contents entry — gave me nothing of any use. The worst part of this is that I was not actually that surprised because this is normal fare for much of IBM’s documentation. While working with another tool recently, I found a page in the documentation entitled “Pre-installation tasks” which contained (only) this sentence.
Before you install, you might need to prepare or configure your computer.
I’m not clear on whether documentation like this is merely lip service to a product manager, or whether someone honestly believes this is good enough. Either way, useless documentation for overly complex software is pretty much the norm from what I’ve seen this year. The really annoying thing about this is that I’ve seen the great side of what’s possible, and that’s from IBM, too. The vast majority of documentation on IBM’s “i OS” for Power Systems is of a high quality. I know this because I’ve been relying on it, initially in printed and latterly electronic form, for over 25 years. And the OS which it documents is without peer when it comes to quality and reliability. Unfortunately, it’s a platform that cannot sustain my career any more and I worry about my sanity if I have to move onto other platforms.
I follow the Apple ecosystem through the tech media and within that ecosystem, this type of software quality and documentation would be derided in the extreme. I know Apple and third-party vendors in the ecosystem are targeting individuals rather than “professionals” but there’s really no excuse for what amounts to shoddy, or perhaps lazy work that passes for “enterprise software.”
I’ve written a fair bit of software in my professional life, although nothing near as complex as the tools I have been working with this week. But my software, for all its simplicity, is written and documented for other people to use and I don’t consider it a success unless no-one ever has to ask me about how it works.
Perhaps that’s where the problem lies with enterprise software. From my experience, solving problems with it rarely seems to involve speaking with people who had anything to do with creating the software. They’re probably off creating the next monstrosity. With no comebacks, what reason is there to care about quality and usability? That’s why I want my TCP/IP punch.
Back in January, I wrote about returning to the fold of desktop computers with the purchase of my 27″ iMac with Retina 5K display. After four and a half years with a MacBook Pro (2011, 15″) I was not in love with the laptop concept.
In that post, I said I didn’t see myself returning to a laptop for my primary computer, which implied what I had and have been thinking – a laptop as a secondary computer was still appealing.
A week ago I decided to make that leap and what I found surprised me.
My personal circumstances were about to change and I faced splitting my time between two locations that would make it troublesome to decide where to keep my iMac. It’s not a terribly portable system! So I thought now might be the time to try out the “laptop as secondary” concept. Now… which model?
A friend recently bought a 13″ MacBook Air, which she loves, but the thought of a non-retina screen put me off what is otherwise an outstanding machine.
The new MacBook Pros are lighter and thinner than my 2011 model was, but not by much. Mine was 2.54kg and the new one is 2.25kg. I’d always found the weight of my MBP to be its limiting factor. It was uncomfortable to use on my lap for any length of time and virtually impossible to fit in my carry-on bag for business travel due to that weight and, to some extent, size.
So the Air was not modern enough and the Pro was too cumbersome. That left only the 12″ MacBook. This model has long fascinated me as I have listened to prominent podcasters try it out and fall into two distinct camps. Some fell in love with it, and some found it wanting. Was it the one for me?
On the surface, the technology is brand new, it has a retina screen, it is ridiculously light (0.96kg) and small enough to fit in bags with ease. But could it be underpowered, and would I cope with that keyboard and the lack of ports? There was only one way to find out.
I tried and failed to purchase it immediately before going on a 5-day business trip, but on the first day of my trip, I managed to sneak enough time to pop out to a high street shop and purchase one. I got the higher specced model in gold. My free time over the next couple of days was, shall we say, “messed up,” due to working some long and oddly timed shifts, but I managed to find time to go through setup, installing commonly used software and tweaking settings. Thank goodness for free wifi at the hotel and in the office.
On the evening of the third day, I actually had a decent amount of free time and so started really using it – for email (Gmail in Safari), Notes, OmniFocus, Photos, and various web reading and responding. I was doing much of this lying on the bed in the hotel room with the TV on. I’m never comfortable for long when sitting or lying down so I was shifting around quite a bit, but I noticed that no matter how I sat or lay, the MacBook was easy to shift around to somewhere comfortable.
I could lay it on my lap in classic laptop style, I could open it to its fullest extent and prop it up against my knees. For reading, I could even hold it by one side near the hinge and suspend it in the air at an angle. When I wanted to get up to grab a drink or snack, I could easily toss it aside onto the bed and maybe flip the lid closed. I realised after a while that the MacBook is pretty much as portable as an iPad.
Since then, I’ve had two more nights in the hotel and now several nights at home, and even a couple of days off work where I’ve grabbed it during the day in much the same way I would previously have grabbed my iPad mini. The poor iPad mini has barely been touched. In fact, it hasn’t been on a charger since I left for my trip. That’s over a week now! I had preloaded the WWDC keynote video, in HD, on the iPad, and when I finally got around to watching it in my hotel room on the last night, I ended up stopping after 20 minutes and finding a low-def stream I could run on the MacBook, simply because I could easily prop it up or lay it on the bed to watch.
I’m still getting used to the keyboard, and I wish it was exactly like the Apple Magic Keyboard (which I love), but it’s not bad and so I get by. I’ve not yet noticed any sluggishness with tasks I have done on it. The screen is gorgeous. But most of all, its lightness and diminutive form factor are its best features. I still don’t think I’d be without a desktop for heavy lifting work (recording podcasts, image compositing, photo management and processing, and more) but as a secondary device, the MacBook shines.
One day, perhaps, the power of a MacBook Pro will be available in the same form factor, and I look forward to re-evaluating my stance at that time, but for now – the MacBook is what a laptop should be.
I’ve been hearing a fair bit lately about how amazing Google Photos is at intelligently identifying the subjects of photos. On numerous podcasts I’ve heard about people searching for “hugs” and “dogs” and “patios” and getting “astonishing” results. So I thought I’d give it a go.
I first installed the Google Photos app on my iPhone and gave it permission to back up my Photos library from there. It set to uploading immediately and I gave it plenty of leeway to get into the task, but soon realised that was a bad idea because I’d have to leave that one app open on my phone for a long time.
Next I installed the uploader app on my Mac, gave it permission (only) to the same iCloud Photos library, and left it to its own devices. Within a day, all of my 5,000+ photos were in the cloud. Next up was to try out this clever searching.
First up – “trains.” About 80 photos appeared, including very obvious photos of trains, but also photos of parts of trains (e.g. windows) and the insides of trains. Pretty clever. I figure these are all iPhone photos and therefore geotagged, and all of those interior shots would geolocate to a railway line. Fair enough. Seven shots are of railway lines but don’t include trains, and two shots are of trucks and cars in a carpark miles from any railway. Not a bad score.
Next up is “cats.” This is most impressive. About 300 shots and every single one features at least one cat. That includes a photo of a picture of an ornamental cat and a tiny face peering through a window.
Third in the list is “aircraft.” One shot inside a train, one suburban sunset, one of an airport apron with no aircraft (but a fire engine), a marina, a bus, and a snowy garden pollute an otherwise impressively accurate selection of a few hundred. This includes aircraft interior shots and out-the-window shots as well.
My fourth and final test is “cars.” About 58 of 300 shots include some kind of vehicle I would accept as a “car,” although this includes trucks, aircraft tugs, and a large crane. What’s more impressive is the range of other subjects that were selected. Brace yourself!
Aircraft (many!), a harbour sunrise, other harbour vistas, boats and ships, architecture, a kitchen cupboard’s contents, a train ticket, bicycles, a coffee machine, a tangle of cords, trains (including a few that the “trains” search did not select), airports, yachts, a store candy display, a footpath sign advertising pies, birds, creepy sculptures, a busker, selfies, a cat, a half-built gazebo, and a house! And it missed some fairly obvious car shots, too.
So a bit of a fail on that last one! I’ll admit the first three were quite impressive, but as many people have said of other services recently (notably Siri), if you can’t rely on it then it’s of considerably less value. My guess is they’ve taught the service how to recognise certain subjects, cars not amongst them.
But the one disturbing facet of the service, and the reason why I’m now removing all of my photos and the software, is that I have spotted a lot of photos that were deleted from my Photos library. They do not appear in the Photos app, not even in the Recently Deleted folder, and from memory, most haven’t been in my library for a long time. This gives me concern on two levels – first, that Apple isn’t physically deleting the photos, and second, that Google are likely not using official APIs if they’re finding deleted photos!
Marco Arment wrote about it. Friends Scott Willsey and Allison Sheridan have written about it. I’ve lost count of the number of podcasts I’ve listened to lately that have discussed it. Apple’s software is not living up to fairly basic expectations – that it do what it says on the box. Every time.
Mostly I’ve tended to think, “Yeah, they have a point, but it’s not so bad as I see it.” Today I’m writing this blog post because it is bad as I see it, and it’s getting beyond simply disappointing.
It was a somewhat cool, slightly rainy evening on my commute home from the city today. As I almost always do, I fired up Overcast on my iPhone, secured my earbuds, tucked the phone in an inside jacket pocket and set off from the office into the cold, darkening world outside. I had noted that I probably didn’t have enough remaining time of unlistened-to podcast to get me all the way home.
The train was moments away from pulling into my station when the podcast ended. I considered fishing out my phone to fire up the Music app but as the train was already slowing, I opted to wait. I got out of the train into a light drizzle and walked around the far side of the station building, both to delay my arrival at the regular mash of people entering the pedestrian over-bridge and to give me some free and quiet space to ask for Siri’s assistance. My experiences haven’t been great with Siri, so I thought I’d keep it simple.
[Pushes and holds remote button on headphone cord…] Beep! “Launch Music”
Boop! “Sorry I didn’t get that.”
[Pushes and holds remote button on headphone cord…] Beep! “Launch Music”
Boop! “Sorry I didn’t get that.”
Sigh. Siri was a wash, again. I have reasonable success with it inside, but very rarely when outside. I was at the human jam at the foot of the over-bridge when I pulled out my phone, in the light rain, unlocked it with Touch ID, pressed the home button to exit Overcast, and launched the Music app. As the app zoomed into view, I quickly jammed the phone back in its pocket as I could already see numerous tiny rain drops on the screen. I don’t like getting the phone wet!
As I walked up the ramp on the bridge, I pressed the (only) remote button on my headphone cord. Nothing. I pressed it again. Nothing. It should be noted at this point that during my earlier walk I had twice used that button to pause and resume the Overcast audio. In fact, I use it all the time when I go into shops with my earbuds in or on the train when an announcement is made. It works. Always. Except now.
I tried a third and a fourth time before I gave up, nearly at my car. I got into the car, extracted my earbuds, started the car, and then extracted the phone. There on the screen was the song I had been listening to earlier in the day, with the play icon beckoning. I can only assume that, for whatever reason, the remote was still trying to start playback for Overcast, which would technically, I guess, be the background audio at this point. EXCEPT… the MUSIC app is now in the foreground! Could my intentions be ANY clearer? I don’t think so.
I might have simply written this experience off as a glitch, except I have a hateful relationship with “currently playing audio” on my iPhone. Almost every night when I arrive in the house, the last thing my phone was doing was playing audio from Overcast. Every morning when I get into my car and hook it up to Bluetooth, guess what starts playing? Music! Every time. Every. Single. Time.
Here’s a newsflash: I almost never listen to any audio on my phone when I’m at home. Certainly not music. And it’s pretty clearly not me doing it, because the music that gets played is always the first song on my iPhone. What’s the first song? The first one in the list of “Songs” in the Music app – a list I almost never use. Apparently, in its determination to always be playing music, whether or not I’ve played any recently, it has to start somewhere.
So now I’m joining the chorus and calling on Apple to up its game. These are delightful, revolutionary products that can be a joy to use every day. How infuriating, then, that these silly glitches can so easily ruin the delight.
I am once again going to misquote Winston Churchill and state that iOS is the worst mobile operating system, except for all the others.