Colin Devroe

Reverse Engineer. Blogger.

E11: Browsers, Surfaces, MacBook Pros, and Tesla roof

Danny and I have an early Sunday morning conversation about our browsers of choice (he likes Vivaldi), Microsoft and Apple’s announcements this week and the Tesla roof.

Download MP3

Touch Bar

I’ve been tweeting like crazy about the new MacBook Pros and how underwhelming I’ve found the most recent updates. But I couldn’t come up with a great way to describe how I felt about the Touch Bar that I’d want preserved in my blog archives. Until I read this.

Michael Tsai:

I’m not crazy about Touch Bar, but it does seem potentially useful.

That’s exactly it for me. The Touch Bar does not excite me. But, I can see how it could be potentially useful.

Apple asked a bunch of people to fly to their campus to show them a new version of the MacBook Pro that is, of course, lighter, thinner, and has fewer ports. And it has the Touch Bar. But it isn’t what I wanted from them. And, as you can see from Tsai’s post, perhaps they don’t care. I don’t care about lighter or thinner. I care about performance, storage, and reliability.

I am underwhelmed. No. It is worse than that. I’m disappointed.

I’m using a 2012 MacBook Pro. I’m ready to upgrade. But I can see no compelling reason to do so. Not for the Touch Bar, that’s for sure. After spending some time with the Surface Book this weekend I can now say I’m going to purchase one and see if I can make the switch. Microsoft’s software may not yet be up to snuff but they certainly have my attention and it appears I’m not alone.

Hey, umm, Siri?

I was happy this week to see that the topic of how far behind Siri is came up on many tech blogs. It is a topic I’ve thought, but not written, a lot about. In 2012 Siri was ahead on ability, but behind on speed. Earlier in 2016, prior to WWDC, I published a WWDC wish list, and in it I wrote that I had hoped:

I hope Siri can do a lot more – I think we’re overdue on being able to say things like “Hey Siri, send the photo I just took to my wife.” Or “Hey Siri, open Spotify and play Jack White’s Blunderbuss.” Or “Hey Siri, find a note in Simplenote that I created on May 15th”. Or, even, “Hey Siri, show me all of the photos I’ve taken in Hawaii.”

Based on what I saw at WWDC I thought I was going to get some of these things. But I was wrong. There are a few things that Siri has improved in iOS 10 but overall it seems that it is falling further and further behind with every announcement from Apple’s competitors.

Even without any competition from other companies I still think Siri should be better than it is. The examples I gave above should already be possible. I’ve kept some notes on queries that I thought Siri should be able to handle by now but cannot, so I’d like to take one of those and add it to my wish list.

“Hey Siri, how long until it rains?”

This query would be huge for me personally. I use and open Dark Sky at least twice a day. Because I hike and kayak and go for a jog outdoors I like to know, as accurately as possible, when it will start raining. Do I have 30 minutes to get in a quick jog? Or do I have a few hours to go kayaking? I’d like to know and it’d be convenient for me if Siri could tell me.
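To make the wish concrete: all Siri would need is the minute-by-minute precipitation forecast an app like Dark Sky already has. Here is a rough sketch in Python of that lookup. The forecast data and the `minutes_until_rain` helper are entirely made up for illustration; they are not any real API.

```python
def minutes_until_rain(forecast, threshold=0.5):
    """Return the index (in minutes from now) of the first minute whose
    precipitation probability crosses the threshold, or None if dry."""
    for minute, probability in enumerate(forecast):
        if probability >= threshold:
            return minute
    return None

# Hypothetical minute-by-minute precipitation probabilities:
# dry for 42 minutes, then rain becomes likely.
forecast = [0.0] * 42 + [0.7, 0.9, 0.9]

minutes = minutes_until_rain(forecast)
if minutes is None:
    print("No rain expected soon. Go kayaking.")
else:
    print(f"Rain expected in about {minutes} minutes.")
```

The hard part, of course, isn’t this arithmetic; it’s Siri being allowed to ask a third-party app for the forecast at all.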

Walt Mossberg:

For me, at least, and for many people I know, it’s been years. Siri’s huge promise has been shrunk to just making voice calls and sending messages to contacts, and maybe getting the weather, using voice commands.

Why are people only using it for these things? John Gruber explains:

The big problem Apple faces with Siri is that when people encounter these problems, they stop trying. It feels like you’re wasting your time, and makes you feel silly or even foolish for having tried.

Even if Siri has gotten better no one will know because they’ve already given up. I know in a lot of cases I have. And that may be a harder hurdle for Apple to jump over than simply improving Siri’s capabilities.

 

10MacApps over 10 years later

Ten and a half years ago I was asked by Zach Hale to jot down my 10 favorite Mac apps and then ask a few others to do the same. Wow, ten and a half years ago. Pre-iPhone.

With the Mac seemingly a second-class citizen both in hardware upgrades and app popularity, now may be the perfect time to bring this post back to light and see what my favorite Mac apps are today.

In no particular order:

A few things I take away from this. First, none of the apps listed from 10 years ago are still my favorite apps. Though I do miss many of them and would happily still use them if they were still being maintained. Second, a few of Apple’s own apps have made their way onto my list that weren’t there before: Calendar and Reminders.

This means that fewer of the apps on the list are indie apps – apps made by independent developers. Rocket, Fluid, and Tweetbot are the only indie apps on my list. I wish there were a ton more. I miss the days when the majority of the software I was running was built by small teams or one person.

Some of the tasks that I used to do a decade ago I still do today, just with different apps. As an example, I still subscribe to a lot of blogs but I use Feedly instead of NetNewsWire. I also listen to music but I use Spotify instead of iTunes. And I edit code in a text editor but I use Atom instead of TextMate.

While the apps have changed, and some are orders of magnitude better than the apps I used then, the tasks really haven’t. The reasons why and how I use a Mac are exactly the same today as they were then.

I’m going to tag three people who it would be totally awesome to see take the time to do this exercise, as I’m sure it would help shed light on a number of Mac apps: Jim Dalrymple, John Gruber, and Brent Simmons.

What Photos for OS X and iOS will be able to automatically detect in iOS 10

Alternate title: My hopes are low for object detection in the new Photos but I still have hope

Reddit user vista980622 did some digital sleuthing and may have come up with the list of over 4,000 objects, memories, and facial expressions that Photos for iOS and OS X will be able to mine all on its own with Apple’s Advanced Computer Vision technology announced at WWDC. The user then wrote this about the landmark detection on Ev’s blog:

Additionally, you can search for various landmarks. For example, Photos can respond for search query of “Maho” (beach in Saint Martin), despite Photos is not programmed or trained to understand specific landmarks. Behind the scenes, Photos app first generates a generic categorization for the scene, “beach”, then searches through a built-in dictionary for all landmarks that has the name “beach” in its definition.

This is a smart approach.

It reminds me of something Craig Federighi (Hair Force One to me) mentioned during John Gruber’s live Talk Show event during WWDC. There are a lot of ways to teach Apple’s Advanced Computer Vision system that do not need to involve sending your photos to them. They know what a beach or mountain or forest looks like. They have access to the location of the photo. And they have access to the world’s knowledge via the web. Combining those things they can find a huge amount of information in your photos that can be used to discover them without ever needing to look at the photos themselves.
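If I’m reading vista980622’s description right, the trick could be sketched like this. Everything here is illustrative: the tiny landmark “dictionary” and the stand-in scene classifier are invented for the sketch, not Apple’s actual data or API.

```python
# A toy version of the built-in landmark dictionary: name -> definition.
LANDMARKS = {
    "Maho": "beach in Saint Martin",
    "Denali": "mountain in Alaska",
    "Bondi": "beach in Australia",
}

def scene_label(photo):
    # Stand-in for the real computer-vision classifier, which would
    # produce a generic categorization like "beach" or "mountain".
    return photo["scene"]

def matches_query(photo, query):
    """True if the query names the detected scene directly, or names a
    landmark whose dictionary definition mentions that scene category."""
    label = scene_label(photo)
    if query.lower() == label:
        return True
    definition = LANDMARKS.get(query, "")
    return label in definition

photo = {"scene": "beach"}
print(matches_query(photo, "Maho"))  # matches via the "beach" definition
```

So Photos never needs to be trained on Maho specifically; it only needs a generic “beach” classifier plus a text lookup.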

In my wish list for Photos for OS X, iOS, and iCloud Photo Library I mentioned that I wanted to be able to search for objects. I wrote:

Sort of related to the auto-generated albums above, I’d love to be able to search for “red” or “lake” or “tree” and get results. Google is killing Apple at this. And it just makes so much sense. The more the application does for you the less classification you have to do manually. I tag my photos with things like “cat” or “ants” or “beetle” or “snake” because I want to be able to search for these things later. And adding my own layer of taxonomy on top of my library should always be an option … but for objects that are easily identifiable these days (like lakes or cats) it just makes sense.

It appears I’ll be getting that. I noticed a lot of object- and animal-specific terms in the list that vista980622 shared. One stood out: “arachnid”. I hope, and am pretty sure, I will still be able to search by “spider” even though it isn’t listed. Which brings us to the discoverability of these types of searches. I hope Apple doesn’t only provide a search box but that they also suggest searches or create pseudo-albums for you.

For example, Google Photos creates albums (sort of) by simply giving you a way to find those objects in your library without searching for them. They aren’t albums so much as links to search results that look like albums. I hope Apple builds in a discovery mechanism too. And it’d be great if it were based on what I took photos of the most.

Looking through my Library it’d be easy to see that I take a lot of photos of lakes and rivers (kayaking), bees, barns, and buildings. I also visit a lot of wineries and breweries. It’d be nice if Apple simply had “pseudo albums” or saved searches at the ready for me for all of these things. And then they could throw in a few for fun like cats, pink, beach, panoselfie.

One tidbit about the assumed facial expression detection in Photos… They seem to be using this to create memories. Who wants memories of a bunch of angry people? So I’m guessing that if they want to make a bunch of happy memories for people they needed to go beyond just detecting the people in the photos to also detecting what mood they were in.

OK, one more tidbit about face detection. I’m skeptical that this will be any good. But I hope I’m wrong.

Currently there are two kinds of face detection. The first involves determining that there is a face in the photo. You’ll see iOS’s camera app doing this live while you’re shooting. A yellow box will surround people or objects in the frame so the camera can make sure your subject(s) are in focus. That is face detection that simply says “we think this is a face”. Then there is face recognition, which involves determining the actual person in the photo. Photos for OS X has this currently… though it needs to be improved a lot. Like, a real lot. Check out this example from this weekend:

Photos OS X Face Detection Error Fail

You can click the image to zoom in a bit.

On the left, my friend Matt (who has a face). On the right, a vending machine (which does not have a face).

Photos for OS X believes that the vending machine is a face. This is technology that Apple has been mucking around with for at least seven years, as it debuted in iPhoto in 2009. It didn’t suggest any names for the face (it rarely does, which I’ve covered here), but it doesn’t even see Matt’s face.

I’m sure that Apple’s new Photos for iOS and OS X will be better than what we currently have but I’ll wait and see before I get excited. Because so far they’ve yet to be great at this and Google and Facebook kill them at it.

I’m anxious to play with iOS 10 and macOS Sierra. But not anxious enough to install the betas on my hardware. So I’ll be writing a lot more about this in the fall.

Random subtle updates to Apple software

Apple could not possibly cover every update to iOS, macOS, tvOS and watchOS in their Keynote. So as the nerds have been picking through the trash in and around San Francisco they’ve been able to dig up several subtle changes that are worth noting.

Here are a few of them that I’ve found via Twitter and scouring the blogosphere. Some of them were mentioned in passing in the Keynote as well but I thought I’d highlight those too.

  • For developers, Apple made a variant of the San Francisco font called SF Mono. Yes!
  • Apple Maps can now find gas stations and restaurants along your route of travel. This will be enormously helpful.
  • macOS gives a tabbed interface to nearly every application out of the box without the developer needing to make an update. This will end up being a bigger deal than it may seem.
  • Creating sticker packs for iMessage requires no code. So expect a lot of sticker packs.
  • iMessage allows for read receipts to be sent, or not, on a per conversation basis.
  • iOS 10 allows you to remove default Apple apps (like Tips, Stocks, etc) from your Home Screen. It doesn’t delete the app, however.
  • The News app now allows you to subscribe to specific publications rather than only selecting them as possible sources. RSS reader?
  • iOS 10’s Phone application can make Skype (and, presumably in the future, Slack, Google Hangouts, etc.) a first-class citizen and add their call lists, voice mails, etc. to the Phone app. Think of how FaceTime works now. It is built into Phone.
  • Apple Pay on the web will make paying for things incredibly easy. It would be possible for the worst shopping experiences ever to become one or two taps.
  • iOS 10 lets you edit Live Photos and even stabilize them (like Google Motion Stills).
  • Universal Clipboard (macOS, iOS) will allow you to copy text, images, or video from one device and paste them on another. The engineering to make this happen must be amazing yet the feature is completely invisible.
  • Picture-in-Picture on macOS. Right now I use a bookmarklet in Safari to force a YouTube video to pop-up and be in its own window. This way I can continue working or using Safari. Picture-in-Picture will allow me to do this, even in full screen apps.
  • tvOS’s Single Sign-on feature takes a 5-step process (or more) and makes it zero steps. No more going to CBS News’ website and typing in a 4-digit code to get access.
  • AppleTV will now support 4 game controllers, not just 2.

I’m sure more and more smaller, subtle things will come to light as Apple puts the finishing touches on these releases for the Fall.

Addendum: Mac Rumors has a few good ones. Notably, the Wake Alarm and Flashlight intensity settings.

Second addendum:

A few new tidbits emerged overnight. Partially due to John Gruber’s live The Talk Show with Hair Force One and Phil. Namely:

  • iOS 10 will support shooting and editing in RAW
  • And though some Apple apps can be removed (see above) they can’t be independently updated and will not be available in the App Store. Which is sort of a pity because that could have meant faster update cycles for the most-used apps.
  • Apple still loves the Mac App Store. Yeah, we’ll see.
  • Safari 10, in macOS Sierra, will turn all plugins (Flash, Quicktime, etc.) off by default.

Third addendum:

  • If you use an external keyboard on iOS one of the keyboard shortcuts you may use is “CMD+Tab” which lets you cycle through the apps you currently have open. In iOS 10 the Home Screen is now an option on that.

Fourth addendum (honestly, I could keep going and going):

Fifth addendum:

  • In iOS 10 a “magnifier” can be turned on. Let the Sherlock Holmes jokes run amok.

If you notice anything else, send me a tweet or an email or something.

WWDC murdered my wish list

In a good way.

Yesterday I scrawled a few comments during the WWDC Keynote, and did 1-second reviews of the announcements on Snapchat, but I thought I’d jot down the tally of things I had hoped for against what was actually announced.

First, however, let me just say that the amount of work Apple showcased yesterday is just staggering. No doubt thousands and thousands of designers, engineers, operations people, and more made all of that possible. And watching the Keynote back this morning I’m left wondering if any of them have slept in the last year.

OK. Let’s start with the wish list:

  • “I hope Siri can do a lot more” – Yes.
  • Big changes to macOS – I think what I meant, and what we got, are different. However, what we got is pretty great. Some of the demos on the Apple web site are better than those that were shown on stage. And the “tabs everywhere” feature is bigger than it seems.
  • Displays? Nope. But I don’t really care, honestly.
  • Updates to tvOS? Yep. Big, big updates here. Looks like now is the time to buy an Apple TV.

So, pretty much everything I wished for.

Now, onto the things I wanted to see in Photos and iCloud Photo Library specifically. When I had written that post I separated the improvements that I hoped to see into two categories. I had a list for things I thought we’d definitely see and a list of things that I was skeptical that we’d see (due to the amount of work they are to do). Well, it turns out that I’ll be getting a lot of the things I thought were “pie in the sky” for Apple to release this year.

  • Facial recognition more liberal – Yes. I think we got this and much more with all of the Google Photos-like “advanced computer vision” features. I can’t wait to see how well this works. If it works as well as or better than Google Photos it will be mind-blowing to a lot of iOS users.
  • Sync Smart Albums – Unknown.
  • Auto-generated Albums (for things like water, etc.) – Yes.
  • Face tagging on iOS – Yes.
  • Map view – Yes.
  • Tagging on iOS – Unknown. (But I still hope)
  • Spotlight – Perhaps. But it appears I can search for photos with Siri. “Show me photos I took yesterday” So I’ll take it.

Many of the other things that I had hoped they’d improve are under-the-hood improvements so I’ll need to wait and see if I get them.

As I was watching yesterday’s WWDC Keynote I felt like Apple was reading my blog and simply checking off the boxes.

That was a strong WWDC Keynote. Huge leaps forward in software and services. A near impossible amount of work has been accomplished.

Well all of the Photos updates I wanted for iOS 10 look like they are happening. Wow.

macOS looks like a great update. Hair Force One continues to be Apple’s best presenter.

This tvOS update is also pretty great. Looks like it might be time to update.

This watchOS update is an incredible improvement. Apple must have a huge amount of resources on Watch.

MSFT’s E3 Xbox Keynote has had gallons and gallons more blood in it than Apple’s WWDC Keynote will. Turning off the E3 Keynote.

WWDC 2016 wish list

Wish lists have been swirling around these last few days and many of them are quite good. But none of them are mine. So here is my wish list, not my predictions, for what will be announced today at WWDC.

  • I hope Siri can do a lot more – I think we’re overdue on being able to say things like “Hey Siri, send the photo I just took to my wife.” Or “Hey Siri, open Spotify and play Jack White’s Blunderbuss.” Or “Hey Siri, find a note in Simplenote that I created on May 15th”. Or, even, “Hey Siri, show me all of the photos I’ve taken in Hawaii.”
  • I already wrote my list for Photos and iCloud Photo Library. Any updates to these apps and services whatsoever would be nice since I use them so much. One thing I left out of that post; slightly better video editing. I don’t need iMovie. But a few more video editing tools built into the Photos app would be nice.
  • I’m hoping for big, big changes in the new macOS (formerly Mac OS X). macOS is long overdue for a big update. I wouldn’t mind, even, if it sort of felt like a do-over.
  • My only wish for the much rumored Apple Displays is that they are affordable. But I won’t hold my breath.
  • Lastly, if there are some nice updates to tvOS I’ll pick up the new Apple TV.

Anything other than the above will be icing on the cake. I love Keynote days.

App Store Subscriptions

Yesterday the news hit of Apple’s changes to App Store policies and features including allowing developers to leverage Subscriptions for their applications so that they can better make a living making great apps.

This, from John Gruber’s coverage at Daring Fireball:

Now, subscription-based pricing will be an option for any sort of app, including productivity apps and games. This is an entirely new business model for app developers — one that I think will make indie app development far more sustainable.

Some of you reading this may wonder why this is important and some of the coverage doesn’t really lay it out.

As it stands, for most apps in the store, you pay once and get upgrades for as long as the developer can afford to give them. Some applications, but not all, require constant maintenance. Perhaps they run a syncing service so that your information is available across all devices or platforms. Perhaps the services they are built on top of change a lot and so app updates are needed often to keep the app working. Or, perhaps they offer new content (like game levels, or editorials, or videos, etc.) and to support the creation of that content they need money.

All apps require updates a few times a year as iOS releases and new Apple devices are released.

The problem right now is that developers need money to continue coming in over time to build and update great apps. The “pay once, get updates for free forever” model isn’t sustainable for apps that do not offer in-app purchases.

As a consumer of these apps (and you’d know this if you’ve read this blog for a long time) I want to pay for upgrades. When Tweetbot was released as a wholly different app to skirt around the limitations of App Store policy, I gladly ponied up. I use the app daily. I want it to continue working. So I will pay. There are other apps that I wish did the same thing.

I know there is a bit of confusion at the moment about exactly which apps are eligible for this. There is always confusion when a change like this is introduced. It’ll all shake out. I’m very happy to see this change and look forward to supporting my favorite apps with my money. It means I’ll get to continue to use them.

Got Loop’d

It is always a pleasure to be linked to from The Loop. Yesterday, one of my posts about Photos for OS X and what I think can be done to improve it, was linked to (at my behest) by Dave Mark.

My link is in good company there. Be sure to check out the other links too.

This, in turn, spurred a few other links flowing into that post. This morning I saw this link from Benny Ling on AppleTalk from Australia. I love the internet.

Improving Photos for OS X and iOS and iCloud Photo Library

I’ll start out this post, as most empathic developers would, by saying that I realize how hard syncing is. It is incredibly hard to get right. The fact that it works at all is magic. It is amazing. And I’m tickled that I even have it.

That being said, we’re a few years away from men and women walking on the surface of Mars. So I guess we can expect a bit more from Apple’s photo experience.

For context, I have a library that weighs in at just about 230GB and comprises 67,000 photos and videos. And it is growing pretty quickly. I shoot with a variety of cameras but mostly I use my iPhone and GoPro. I also shoot a fair amount of video. I use Photos for OS X on my Mac, which is backed up to an external USB drive. Photos for OS X is set to download originals from iCloud Photo Library (I pay for the 1TB option). Photos for iOS is set on both my iPad and iPhone to sync to iCloud Photo Library but to “manage space” by not keeping the originals on the device.

Now that you know my setup, here are just a few of my suggestions to how Apple could improve Photos for OS X and iOS and iCloud Photo Library. Some of these I expect to see this year. Others, likely never. Let’s start off with some things I feel will likely improve.

Improve sync connection hogging

As I’ve griped about many times over the last two months while my photo library on my Mac synced to iCloud Photo Library — it kills my connection to the internet. I don’t mean to mince words. Let me be very clear. It doesn’t slow down my connection. It doesn’t make it a hassle to use the internet while this sync is happening … it kills it. The internet connection in my home is unusable by any other application or device while Photos for OS X syncs.

A huge improvement to the entire experience would be to stop this from happening. If you’re at Apple reading this I’d be more than happy to share any information about my current set up to help improve the process.

Facial recognition could be more liberal

The number of false positives I see when using Photos for OS X’s facial recognition is very, very low. Yet I still have to click click click click click click click to add Faces to photos. Even when I’m adding them en masse (it selects about 4 to 6 at a time, asks “Is this Colin Devroe?”, and you have to hit yes over and over and over).

One way to improve this would be to just allow more possible faces through. Rather than automatically tagging 10-15% (as it seems to now), auto-tag 50% or more. I’d be willing to bet that I wouldn’t need to go back and change many.

Or, and this is likely the easier solution, show 50 or 100 possible matches rather than so few. This way I can quickly scan them all and get on with my day.
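To illustrate what I mean: assuming the recognizer scores each candidate face, the conservative behavior Photos has today and the liberal behavior I’m asking for differ only in where the confidence cutoff sits. The scores, filenames, and cutoffs below are invented; I have no idea what thresholds Apple actually uses.

```python
def auto_tag(candidates, threshold):
    """Split scored face candidates into auto-tagged matches and
    ones that still need a manual click to confirm."""
    auto = [name for name, score in candidates if score >= threshold]
    manual = [name for name, score in candidates if score < threshold]
    return auto, manual

# Hypothetical recognizer confidence scores for six photos of one person.
candidates = [("photo_%d.jpg" % i, score) for i, score in
              enumerate([0.98, 0.91, 0.88, 0.72, 0.64, 0.40])]

conservative, _ = auto_tag(candidates, threshold=0.95)   # roughly today
liberal, leftover = auto_tag(candidates, threshold=0.60)  # what I want

print(len(conservative))  # 1 auto-tagged, 5 clicks left for me
print(len(liberal))       # 5 auto-tagged, 1 click left for me
```

Since the false-positive rate I see is already very low, lowering the cutoff trades a few wrong tags I’d have to undo for hundreds of clicks I wouldn’t have to make.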

Aside: There is a bug in adding Faces that is super frustrating but I’m sure they’ll lick it in an upcoming release. If a face isn’t detected by Photos you can add it yourself. You open the info panel, click “Add Face”. Pretty simple. However, more often than not when the circle appears that you are to drag and resize onto someone’s face, you can’t move it. It doesn’t always happen but it happens a lot. Far greater than 50% of the time. I have not figured out how to get around this bug.

Sync Smart Albums

I have a collection of Smart Albums for all sorts of things. One is to filter by camera model. This way I can see the photos I’ve taken with my SLR or my iPhone 5 or 6 or SE. Photos for iOS does not show the Smart Albums. It’d be nice if it did.

Sync metadata at the same time as the photos

My sync to iCloud Photo Library is nearly complete. I have about 67,000 or so photos and videos and my iPhone, this morning, is reporting just over 61,000. However, much of the metadata for the photos that have already synced hasn’t yet made it across the chasm. I mentioned this in an earlier post: if I search “kayaking” I get far fewer results on iOS than I do on my Mac. Yet the photos from those potential results have already synced to iCloud Photo Library.

This results in a bit of frustration, which I can deal with, but I’m willing to bet that “normals” would think that search simply doesn’t work and wouldn’t know to wait until the entire sync is done.

Author’s note: I’m finishing the editing of this post several days later (after the sync completed) and the metadata does come over last. So search results are matching up. I really do think the metadata should transfer at the same time. Or, perhaps even before the entire library is synced.

Improve syncing new photos back to Mac

If I take a photo on my iPhone it shows up on my iPad through iCloud Photo Library fairly quickly. However, not a single photo taken since starting my months-long sync to iCloud Photo Library has shown up on my Mac. Perhaps they will when the sync is finished.

Author’s note: This has in fact happened. Now when I take a new photo it shows up on all devices, including my Mac, within a short period of time. It works beautifully. It would have been nice if this had been happening on the Mac all along like it does on iOS. Why the difference?

Spotlight?

Maybe I just haven’t been able to find this… but Spotlight doesn’t search my Photos library on OS X or on iOS? This seems like something that has to be coming, right?

Now, onto wish list items. Things I wouldn’t hold my breath for but that I would love to see in an upcoming version of Photos for OS X and iOS.

Auto-generated Albums à la Google Photos

I know I’ve mentioned this before but the albums that Google Photos auto-generates are genius. And I know Google Photos is a cloud-based service and so they’re able to run all sorts of fancy algorithms against your library (whereas, presumably, the Mac app would kill someone’s computer figuring all of this out) but with iCloud Photo Library turned on could Photos on Mac and iOS show auto-generated albums for things like cats, lakes, rivers, sky, etc? Once I saw something like this I wanted it everywhere. If you haven’t tried Google Photos give it a whirl. It is pretty amazing.

Face tagging on iOS

Tagging faces would likely be even easier to do on iOS than on the Mac (for the photographer). I may even take a moment after shooting photos to tag my friends’ faces just to keep up with it rather than falling behind and having to wait until I get back to my computer.

Facebook has had this for years.

Photo editing and filter improvements

The current filters on both OS X and iOS are trash. Where Instagram goes for rather subtle or nostalgic edits, Apple’s filters just trash your photo. I do not know why I feel so strongly about this but, to me, they are terrible. And I hate dogging on people’s hard work.

That being said, the editing features are pretty good. I think one thing I’d add to both OS X and iOS is the ability to turn edits on and off quickly during the editing process. Have you ever tapped and held your finger on your photo in Instagram’s edit screen? You can see the original photo and compare specific sections while you edit. Say you’re bringing up the shadows to show a rock cliff a bit better, you can tap, hold, and see how much light you’ve pulled out of that area. The same thing can be done now on OS X and iOS by toggling off each section of the editor (3 or 4 taps rather than a single press and hold). It’d be nice if this was a single action.

Search by color or object

Sort of related to the auto-generated albums above, I’d love to be able to search for “red” or “lake” or “tree” and get results. Google is killing Apple at this. And it just makes so much sense. The more the application does for you the less classification you have to do manually. I tag my photos with things like “cat” or “ants” or “beetle” or “snake” because I want to be able to search for these things later. And adding my own layer of taxonomy on top of my library should always be an option … but for objects that are easily identifiable these days (like lakes or cats) it just makes sense.

Facial recognition in videos

I would have guessed we’d have this in 2016. I remember in 2008 or 2009 when I was working at Viddler I had come up with a conceptual way of pulling this off for our platform. We never fully implemented it. But I did take a swing. I still have the code.

It went something like this: every video has a certain number of keyframes in it. You can think of those keyframes as thumbnails. In fact, at Viddler we stored several of those thumbnails per video. Imagine tagging someone’s face in a video and using facial recognition on the rest of the keyframes just to mark where in that video the person was. (At the time, face.com’s API was still a thing; it could have been done for free.)

The simplest, easiest solution for Photos would be to search through a few keyframes of the video, find some faces, and suggest some names. Even at that level it would allow for saying “Hey, Colin appears somewhere in this video.”

However, even deeper and more valuable, would be to know when someone appeared in a video. And this would be totally possible to do using machine tagging. E.g. “person:name=colin-devroe”, “person:appears=99.00” or “person:appears=99.00-110.00”. How cool would that be? “Hey Siri, show me some clips of my friend Bryan from our camping trip in 2008.”
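Here is a rough sketch of how those machine tags could be parsed back into “who appears when” data. The tag syntax follows the person:name/person:appears examples above; the assumption that the tags arrive as name-and-appearance pairs is a simplification of mine, not anything Viddler or Apple actually does.

```python
def parse_appearances(tags):
    """Map each tagged person to the (start, end) second ranges where
    they appear, from machine-tag pairs like
    ('person:name=colin-devroe', 'person:appears=99.00-110.00')."""
    appearances = {}
    for name_tag, appears_tag in tags:
        name = name_tag.split("=", 1)[1]
        span = appears_tag.split("=", 1)[1]
        start, _, end = span.partition("-")
        end = end or start  # a lone timestamp marks a point appearance
        appearances.setdefault(name, []).append((float(start), float(end)))
    return appearances

tags = [
    ("person:name=colin-devroe", "person:appears=99.00-110.00"),
    ("person:name=bryan", "person:appears=12.00"),
]
print(parse_appearances(tags))
```

With data in that shape, answering “show me clips of Bryan” is just a dictionary lookup plus seeking the player to each range.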

Tagging on iOS

I can tag photos on OS X with things like “kayaking” or “insects” so that I can find them later. And these search results appear on iOS. I’d love to be able to tag my photos on iOS too.

A Map view

Honestly, how isn’t this a thing? A single map view that shows where all of my photos were taken. Nearly every other photo service I’ve used has something like this. Flickr has had it since the dawn of man. It seems likely that this was a conscious omission by the Apple team. They must not find this sort of feature valuable because they have all of the pieces (Apple Maps is built into both iOS and OS X pretty deeply at this point).

Better Library exporting

Exporting from Photos is terrible. Apple’s history of photo library management, which spans decades of hard-learned lessons, should tell them that making the library exportable to some open standard is a huge win for customers. Apple’s mission over the last few years has been beaten into our brains … they care about us. They say they do. They are willing to fight the FBI in court to protect the information I create with their devices… are they not willing to allow me to own that data in a way that I can use it anywhere, on anything, and move at any time?

Moving to Photos was painful. Moving away shouldn’t be.

I’m very interested to see what this year’s WWDC brings to this entire experience. Will every single interaction with the platform improve? Will Apple continue to invest in making this experience great? I really, really hope so. And if they do, I hope a few of the things I’ve mentioned here are addressed.

Overall though, now that my library is available on all devices, I’m happy with how it works. I can make do with what they’ve provided. It is well worth the money too. If you’re debating using iCloud Photo Library I highly recommend it.

 

Further iCloud Photo Library observations

On March 29th I began syncing to iCloud Photo Library using Photos on OS X. Today, over a month later, I’m just over halfway done. For context, you may want to read Photo stats and observations, and A few iCloud Photo Library observations.

As with those last two posts I’m going to provide a laundry list of observations that I’ve made. In no particular order:

  • Anecdotal evidence suggests that Apple throttles photo syncing. After telling Photos to sync, uploads take a seemingly random amount of time to begin, suggesting that Apple has some queue in place. Also, I physically drove my external hard drive to a location with 4x the bandwidth I have at home and I was able to upload roughly the same amount of data over the same period of time. Though, at that location the connection was still usable while syncing and at home it is not.
  • Library metadata is not kept up-to-date with every sync. For instance, I’ve begun tagging my photo library and on my Mac the keyword “kayaking” has hundreds more results than on my iOS devices even though those photos are already synced to iCloud Photo Library. I’m hoping that the metadata gets synced at the tail end.
  • During this month-long sync routine I’ve taken 445 photos/videos (not including Eliza’s hundreds of photos). As I take photos they are synced across my iOS devices but are not synced to Photos on Mac. So I have to manually import new photos/videos into Photos myself. I’m guessing this process will work (a new photo should show up everywhere automatically according to what I’ve read) once the entire library has been synced. In fact, Ben Brooks says it is fast.
  • Looking at my “Years” view on iOS I see a bunch of blank thumbnails unless I tap into each respective section over and over and over. Apple is likely trying to conserve as much space as they can by only loading thumbnails as you need them… but it is annoying. Tap tap tap tap.
  • There is no such thing as a photo on my phone anymore. I think that once my library passed a certain size, all photos go to iCloud Photo Library whenever I’m on wifi. So even recent shots need to “download” from the cloud. Since I typically post photos that I’ve taken within the last few weeks, it’d be nice if iOS kept the most recent 1,000 or so photos fully downloaded.
  • Prior to syncing all of these photos to my iOS devices, using Photos was lightning fast. Now, with just about 45,000 photos/videos synced across all devices so far, everything to do with photos feels slower.

I’m looking forward to this process being over with. I have about 50GB still to go and on average I’m able to sync about 12GB a night, so perhaps within the week I’ll be completely done and I can really see how great this service will be.
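That estimate is just arithmetic; a quick sanity check using the figures from this post (~50GB remaining, ~12GB uploaded per night):

```python
# Back-of-the-envelope remaining-sync estimate.
remaining_gb = 50
per_night_gb = 12
nights_left = remaining_gb / per_night_gb
print(f"{nights_left:.1f} nights")  # → 4.2 nights, i.e. under a week
```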

One last observation: if I wasn’t a geek I wonder if I would ever go through this. My wife, as an example, can’t stand that our connection is down while this process is happening. I’m a little more understanding because, while I think Apple could prevent the issue, I understand it takes a lot of bandwidth to sync so many photos. I’m willing to bet only the geekiest of geeks would ever go through the relative pain I have to get this library synced. Google Photos, Flickr, and Picturelife didn’t have this issue.

Year one of the Apple Watch

In January I wrote:

The Apple Watch could be called a flop if it sold so poorly and customer demand or interest was so low that Apple totally shelved the project. But they haven’t. I’m willing to bet they made a lot of money on the Watch so far (far more than any of their competitors in the same space). And I’m willing to bet that in 2016-2017 Apple will double down on the Watch and make some incredible improvements to every piece of it.

Today Apple releases their quarterly earnings statement and, while they won’t directly comment on the number of Watches sold — or how those numbers break down between the different styles of watch — analysts have backed into a figure that settles in around $6 billion.

Flop? From now on I am classifying anyone willing to write the word flop into a headline about the Apple Watch as a clickbait artist, scammer, or moron. You’ve been warned. No matter how you frame it, a product earning $6 billion in its first year is not a flop (and I’d say it is likely that, even with the brand-new tooling required to create these devices and the R&D that led up to them, the Watch was still profitable in year one). Of course people are making comparisons to Apple’s competitors in the smartwatch space and even in the traditional watch space (which I do not feel is a fair comparison). However, Apple smashed all of them in revenue; see Fitbit (by 3x) and Rolex.

How much better could it possibly have done?

John Gruber takes a slightly more reasoned tone:

Apple Watch can’t be neatly summarized with a one-word description like “hit” or “flop”. It has some serious, deep flaws, but it has sold well — especially considering those flaws. And the people who own one tend to really like it.

I can agree with John — that the Apple Watch isn’t a runaway hit. And I don’t mean just sales. There are some design issues with the Watch and it certainly feels like a version 1. As did the iPhone.

Me, again, in a different post in January:

My wife has an Apple Watch. I’d call her a “light user” of the Watch. She wears it every day but mainly uses it for glancing at text messages. There are a myriad of other uses but, just like the original iPhone, they are a bit too slow to be fully useful yet. You can use them but you don’t very often because they are too slow.

Speed is an issue on the Watch. But this is going to improve by several factors with each iteration.

Here is Gizmodo’s Casey Chan on something he doesn’t like about the Apple Watch, the buttons:

First, I still don’t know what the buttons do. This is ridiculous (and probably very stupid on my part) because, well, there are only two buttons, the digital crown and the side button. Most of the times, pressing the digital crown acts like an iPhone home button. But sometimes it’s a back button (like when you’re in the Favorites contact screen). It gets more confusing because you can scroll through a list with the crown but you can never select, you have to tap the screen for that to work. Most of these things you eventually figure out, but these little inconsistencies just add to the frustration of using it.

I’ve only used the Apple Watch very sparingly as I don’t own one of my own. But I can agree. I’ve never been as confused using any Apple product as I was using the Watch the first time. I remember using the Mac for the first time and every single thing I wanted to accomplish turned out to be far easier than I thought it would be. The Watch needs to get to this point too. And if Apple sticks with it — and I think they will — then I think they will improve on it.

Even with these two main issues, the customer satisfaction numbers are very high. Higher than for the first-gen iPhone or iPad. And, anecdotally, I’ve never talked to an Apple Watch owner who didn’t like theirs.

Rumors aside, I’m sure the next Apple Watch will be faster, lighter, thinner, and hopefully a bit easier to grok. If they can do that, they’ll turn the Apple Watch into a massive hit by any comparison.