Improving Photos for OS X, iOS, and iCloud Photo Library

I’ll start out this post, as most empathic developers would, by saying that I realize how hard syncing is. It is incredibly hard to get right. The fact that it works at all is magic. It is amazing. And I’m tickled that I even have it.

That being said, we’re a few years away from men and women walking on the surface of Mars. So I guess we can expect a bit more from Apple’s photo experience.

For context, I have a library that weighs in at just about 230GB and comprises 67,000 photos and videos. And it is growing pretty quickly. I shoot with a variety of cameras, but mostly I use my iPhone and GoPro. I also shoot a fair amount of video. I use Photos for OS X on my Mac, and the library is backed up to an external USB drive. Photos for OS X is set to download originals from iCloud Photo Library (I pay for the 1TB option). Photos for iOS is set, on both my iPad and iPhone, to sync to iCloud Photo Library but to “manage space” by not keeping the originals on the device.

Now that you know my setup, here are a few of my suggestions for how Apple could improve Photos for OS X, iOS, and iCloud Photo Library. Some of these I expect to see this year. Others, likely never. Let’s start with the things I feel are most likely to improve.

Fix sync connection hogging

As I’ve griped about many times over the last two months while my photo library synced to iCloud Photo Library: it kills my connection to the internet. I don’t mean to mince words. Let me be very clear. It doesn’t slow down my connection. It doesn’t make it a hassle to use the internet while the sync is happening … it kills it. The internet connection in my home is unusable by any other application or device while Photos for OS X syncs.

A huge improvement to the entire experience would be to stop this from happening. If you’re at Apple reading this, I’d be more than happy to share any information about my current setup to help improve the process.

Facial recognition could be more liberal

The number of false positives I see when using Photos for OS X’s facial recognition is very, very low. Yet I still have to click click click click click click click to add Faces to photos. Even when I’m adding them en masse, it selects only about 4 to 6 at a time, asks “Is this Colin Devroe?”, and makes you hit yes over and over and over.

One way to improve this would be to allow more possible faces through. Rather than automatically tagging 10-15% (which it seems to do now), auto-tag 50% or more. I’d be willing to bet that I wouldn’t need to go back and change many.

Or, and this is likely the easier solution, show 50 or 100 possible matches rather than so few. This way I can quickly scan them all and get on with my day.
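To make the idea concrete, here is a minimal Swift sketch of that triage. Photos doesn’t expose its matcher or its scores, so the FaceMatch type and the 50% threshold here are purely hypothetical stand-ins:

```swift
// Hypothetical face-match record. Photos does not expose its matcher,
// so this type and its confidence scores are stand-ins for illustration.
struct FaceMatch {
    let photoID: String
    let suggestedName: String
    let confidence: Double // 0.0 ... 1.0
}

// Auto-tag anything above a generous threshold; queue the rest for review.
func triage(_ matches: [FaceMatch], autoTagAbove threshold: Double = 0.5)
    -> (autoTagged: [FaceMatch], needsReview: [FaceMatch]) {
    let autoTagged = matches.filter { $0.confidence >= threshold }
    let needsReview = matches.filter { $0.confidence < threshold }
    return (autoTagged, needsReview)
}
```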

Aside: There is a bug in adding Faces that is super frustrating, but I’m sure they’ll lick it in an upcoming release. If a face isn’t detected by Photos, you can add it yourself: open the info panel and click “Add Face”. Pretty simple. However, more often than not, when the circle appears that you’re supposed to drag and resize onto someone’s face, you can’t move it. It doesn’t always happen, but it happens a lot; far more than 50% of the time. I have not figured out how to get around this bug.

Sync Smart Albums

I have a collection of Smart Albums for all sorts of things. One filters by camera model, so I can see the photos I’ve taken with my SLR or my iPhone 5 or 6 or SE. Photos for iOS does not show Smart Albums at all. It’d be nice if it did.
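For the curious, the camera model a Smart Album matches on lives in each photo’s TIFF metadata. Here’s a minimal Swift sketch of reading it with ImageIO (the function name is mine, not anything from Photos):

```swift
import Foundation
import ImageIO

// Read the camera model (e.g. "iPhone SE") from a photo's TIFF metadata.
// This is the field a "Camera Model is ..." Smart Album condition matches on.
func cameraModel(of url: URL) -> String? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
              as? [CFString: Any],
          let tiff = props[kCGImagePropertyTIFFDictionary] as? [CFString: Any]
    else { return nil }
    return tiff[kCGImagePropertyTIFFModel] as? String
}
```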

Sync metadata at the same time as the photos

My sync to iCloud Photo Library is nearly complete. I have about 67,000 photos and videos, and my iPhone, this morning, is reporting just over 61,000. However, much of the metadata for the photos that have already synced hasn’t yet made it across the chasm. I mentioned this in an earlier post: if I search “kayaking” I get far fewer results on iOS than I do on my Mac, even though the photos those results would match have already synced to iCloud Photo Library.

This results in a bit of frustration, which I can deal with, but I’m willing to bet that “normals” would think that search simply doesn’t work and wouldn’t know to wait until the entire sync is done.

Author’s note: I’m finishing the editing of this post several days later (after the sync completed) and the metadata does come over last. So search results are matching up. I really do think the metadata should transfer at the same time as the photos, or perhaps even before the entire library is synced.

Improve syncing new photos back to Mac

If I take a photo on my iPhone, it shows up on my iPad through iCloud Photo Library fairly quickly. However, not a single photo I’ve taken since starting my months-long sync to iCloud Photo Library has shown up on my Mac. Perhaps they will when the sync is finished.

Author’s note: This has in fact happened. Now when I take a new photo it shows up on all devices, including my Mac, within a short period of time. It works beautifully. It would have been nice if this had been happening on the Mac all along, like it does on iOS. Why the difference?

Spotlight?

Maybe I just haven’t been able to find this… but Spotlight doesn’t search my Photos library on OS X or on iOS? This seems like something that has to be coming, right?

Now, onto wish list items. Things I wouldn’t hold my breath for but that I would love to see in an upcoming version of Photos for OS X and iOS.

Auto-generated Albums à la Google Photos

I know I’ve mentioned this before, but the albums that Google Photos auto-generates are genius. Google Photos is a cloud-based service, so it can run all sorts of fancy algorithms against your library (whereas, presumably, doing all of this locally would kill someone’s Mac). But with iCloud Photo Library turned on, couldn’t Photos on the Mac and iOS show auto-generated albums for things like cats, lakes, rivers, and sky? Once I saw something like this I wanted it everywhere. If you haven’t tried Google Photos, give it a whirl. It is pretty amazing.

Face tagging on iOS

Tagging faces would likely be even easier to do on iOS than on the Mac (for the photographer). I might even take a moment after shooting photos to tag my friends’ faces just to keep up with it, rather than falling behind and having to wait until I get back to my computer.

Facebook has had this for years.

Photo editing and filter improvements

The current filters on both OS X and iOS are trash. Where Instagram goes for rather subtle or nostalgic edits, Apple’s filters just wreck your photo. I do not know why I feel so strongly about this but, to me, they are terrible. And I hate dogging on people’s hard work.

That being said, the editing features are pretty good. One thing I’d add to both OS X and iOS is the ability to turn edits on and off quickly during the editing process. Have you ever tapped and held your finger on your photo in Instagram’s edit screen? You can see the original photo and compare specific sections while you edit. Say you’re bringing up the shadows to show a rock cliff a bit better; you can tap, hold, and see how much light you’ve pulled out of that area. The same thing can be done now on OS X and iOS by toggling off each section of the editor (3 or 4 taps rather than a single press and hold). It’d be nice if this were a single action.

Search by color or object

Sort of related to the auto-generated albums above, I’d love to be able to search for “red” or “lake” or “tree” and get results. Google is killing Apple at this. And it just makes so much sense: the more the application does for you, the less classification you have to do manually. I tag my photos with things like “cat” or “ants” or “beetle” or “snake” because I want to be able to search for them later. Adding my own layer of taxonomy on top of my library should always be an option … but for objects that are easily identifiable these days (like lakes or cats), automatic tagging just makes sense.
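As a sketch of how little glue an on-device classifier would need, here’s what pulling searchable labels out of a photo might look like in Swift, using Vision’s general image classifier. The 0.3 confidence floor is an arbitrary assumption, and this is an outline of the idea rather than Apple’s actual search pipeline:

```swift
import Vision

// Classify a photo and keep labels above a confidence floor so they can
// be treated as searchable keywords ("lake", "cat", and so on).
func searchableLabels(for image: CGImage, minConfidence: Float = 0.3) throws -> [String] {
    let request = VNClassifyImageRequest()
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    return (request.results ?? [])
        .filter { $0.confidence >= minConfidence }
        .map { $0.identifier }
}
```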

Facial recognition in videos

I would have guessed we’d have this in 2016. I remember that in 2008 or 2009, when I was working at Viddler, I came up with a conceptual way of pulling this off for our platform. We never fully implemented it. But I did take a swing. I still have the code.

It went something like this: every video has a certain number of keyframes in it. You can think of those keyframes as thumbnails. In fact, at Viddler we stored several of those thumbnails per video. Imagine tagging someone’s face in a video and running facial recognition on the rest of the keyframes just to mark where in that video the person appears. (At the time, face.com’s API was still a thing; it could have been done for free.)

The simplest solution for Photos would be to scan a few keyframes of each video, find any faces, and suggest some names. Even at that level it would allow for saying “Hey, Colin appears somewhere in this video.”

However, even deeper and more valuable would be knowing when someone appears in a video. This would be totally possible using machine tags, e.g. “person:name=colin-devroe” paired with “person:appears=99.00” or “person:appears=99.00-110.00”. How cool would that be? “Hey Siri, show me some clips of my friend Bryan from our camping trip in 2008.”
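The building blocks for this exist in Apple’s own frameworks. Here’s a rough Swift sketch of the keyframe approach: sample the video at an interval, detect faces in each sampled frame, and emit the machine tags proposed above. Note that this only detects that a face is present; matching it to a named person would need a recognizer Photos doesn’t expose, so treat this as an outline of the technique, not the real pipeline:

```swift
import AVFoundation
import Vision

// Sample a video every few seconds, run face detection on each sampled
// frame, and emit a machine tag for every timestamp where a face appears.
func faceAppearanceTags(for videoURL: URL, stepSeconds: Double = 2.0) throws -> [String] {
    let asset = AVAsset(url: videoURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true

    var tags: [String] = []
    let duration = CMTimeGetSeconds(asset.duration)
    var t = 0.0
    while t < duration {
        let time = CMTime(seconds: t, preferredTimescale: 600)
        let frame = try generator.copyCGImage(at: time, actualTime: nil)

        let request = VNDetectFaceRectanglesRequest()
        try VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])

        if let faces = request.results, !faces.isEmpty {
            tags.append(String(format: "person:appears=%.2f", t))
        }
        t += stepSeconds
    }
    return tags
}
```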

Tagging on iOS

I can tag photos on OS X with things like “kayaking” or “insects” so that I can find them later. And these search results appear on iOS. I’d love to be able to tag my photos on iOS too.

A Map view

Honestly, how isn’t this a thing? A single map view that shows where all of my photos were taken. Nearly every other photo service I’ve used has something like this. Flickr has had it since the dawn of man. It seems likely that this was a conscious omission by the Apple team. They must not find this sort of feature valuable, because they certainly have all of the pieces (Apple Maps is built into both iOS and OS X pretty deeply at this point).
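To underline that the pieces are there, here’s a minimal Swift sketch that reads a photo’s GPS metadata with ImageIO and turns it into a MapKit pin (the function name is mine):

```swift
import Foundation
import ImageIO
import MapKit

// Pull GPS coordinates out of a photo's metadata and build a map pin.
func annotation(for url: URL) -> MKPointAnnotation? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
              as? [CFString: Any],
          let gps = props[kCGImagePropertyGPSDictionary] as? [CFString: Any],
          let latValue = gps[kCGImagePropertyGPSLatitude] as? Double,
          let lonValue = gps[kCGImagePropertyGPSLongitude] as? Double
    else { return nil }
    // Southern and western hemispheres are stored as positive values plus
    // a reference flag ("S"/"W"), so apply the sign here.
    let lat = (gps[kCGImagePropertyGPSLatitudeRef] as? String == "S") ? -latValue : latValue
    let lon = (gps[kCGImagePropertyGPSLongitudeRef] as? String == "W") ? -lonValue : lonValue
    let pin = MKPointAnnotation()
    pin.coordinate = CLLocationCoordinate2D(latitude: lat, longitude: lon)
    return pin
}
```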

Better Library exporting

Exporting from Photos is terrible. Apple’s history of photo library management, with its decades of hard-learned lessons, should tell them that making the library exportable to an open standard is a huge win for customers. Apple’s mission over the last few years has been beaten into our brains … they care about us. They say they do. They were willing to fight the FBI in court to protect the information I create with their devices … are they not willing to let me own that data in a way that I can use it anywhere, on anything, and move it at any time?

Moving to Photos was painful. Moving away shouldn’t be.

I’m very interested to see what this year’s WWDC brings to this entire experience. Will every single interaction with the platform improve? Will Apple continue to invest in making this experience great? I really, really hope so. And if they do, I hope a few of the things I’ve mentioned here are addressed.

Overall though, now that my library is available on all devices, I’m happy with how it works. I can make do with what they’ve provided. It is well worth the money too. If you’re debating using iCloud Photo Library I highly recommend it.
