Representatives of Google sauntered across the stage at the company’s I/O developers conference this week and revealed wonder after wonder. Google Assistant, its Siri competitor, now looks so advanced that we’ll probably be comfortable dropping the “science fiction” label from the movie Her in a few years. Cameras on Android phones will soon be able to select and copy text from printed books as easily as I selected and copied the text in this document on my Mac.
And all the while, one thought raced through my mind: It should be Apple.
We used to see these kinds of wonders from Apple more often. Lately, though, Apple seems frighteningly willing to surrender impressive leads in technologies, even when it was the first company to really show us that those technologies could be cool. We see that most clearly with Siri, which these days serves more readily as the butt of jokes than as a serious rival to the voice assistants from Google and Amazon. And after Google I/O, Apple seems poised to slip behind in augmented reality as well.
As with Siri, it’s hard to fathom how or why this could happen given Apple’s resources and gargantuan cash pile. More frustrating still, ARKit shows that Apple gets augmented reality, and it wasn’t a stretch to believe that great AR features would quickly follow from the Cupertino giant itself.
But they haven’t appeared. Apple CEO Tim Cook has been saying for months that augmented reality will “change everything.” It’s “big and profound,” he says, and it will “amplify the human experience.” I don’t think he’s wrong about that, but I’m puzzled that Apple seems content to let third-party developers show us how AR can transform that glorious tomorrow into a today. So far, what Apple itself has shown us—impressive though it may be—still looks like toys. We’ve seen AR games where you play Star Wars’ HoloChess. We’ve seen virtual frogs that you can “dissect” on your kitchen table through your phone without ever touching a scalpel.
Google, though, showed us how we can use augmented reality in our everyday lives. More importantly, it showed that we need nothing more than our phones to get the most out of it. In one of the most impressive demonstrations, Google showed how someone could open up Google Maps and then get reviews and related information about the various surrounding businesses by pointing the camera over their doorways. There was even a bit of playfulness reminiscent of the Apple of Bondi Blue iMacs in the form of a fox that showed you the right direction to head in. Elsewhere, Google showed improvements to its Lens technology that gave you information about various items in real time, whether it was actual, playable music from a particular band while you pointed your camera at a concert poster or information on where to buy shoes much like the ones on your friends’ feet.
Here, Google shows an affection for, and a real understanding of, the simplicity we usually associate with Apple. And Apple? It currently has nothing like this. Unless I’m greatly mistaken, we’ve seen absolutely zero practical AR features from the company itself (if you don’t count Animoji).
So what is Apple doing?
The most believable rumors all suggest that Apple is throwing its resources behind a combination AR/VR headset. Even if Apple manages to make such a thing look more “cool” than what we’ve seen from devices like the HTC Vive or the oft-ridiculed Google Glass, it still requires an extra, pricey thing you fit on your head. You have to be prepared to use it. Google, instead, shows us that augmented reality can augment our daily lives with just a phone. Watching someone check out reviews of the restaurants on a city street through Google Maps, an onlooker may think they’re simply taking a photo. Our phones already do almost everything else, so why not this?
It’s but another worrying sign that Apple somehow doesn’t understand how ordinary people use its devices in ordinary situations. I probably wouldn’t be so worried if it weren’t for the spectacular disappointment of the HomePod, a fantastic speaker on paper but laughably deficient thanks to its dogged reliance on an outdated Siri and its walling off of Apple Music competitors. It’s as though no one other than the designers ever tried it out.
Weirdly, Apple seems to be taking the opposite approach with ARKit: Let the app makers discover how Apple’s tools can be used efficiently, and then Apple itself will presumably follow up with its own complementary applications. Again, we’ve seen next to nothing in native Apple programs, which is a strange approach for a company that won’t even let you customize the face of your Apple Watch. And mind you, this isn’t always a problem. In its best releases, Apple’s shunning of outside customization feels less like it’s being restrictive and more like it’s telling us “Let us show you how it’s done.” And sometimes, we’re in awe.
For all Cook’s uplifting talk, we’ve yet to see that from Apple. Apple’s current “let the developers handle it” approach is especially weird since accessing a third-party app with an Apple product almost always requires an extra step, and a huge number of iOS users are happy to use Apple’s onboard apps. And since third-party developers can’t change key features of iOS (such as the native camera app), we only see the novelty of augmented reality. The dissectible frogs; the HoloChess games. As with Siri, it’s hard to believe that Apple hasn’t dropped the ball.
One more thing
Here’s the good news: We’re less than a month out from Apple’s Worldwide Developers Conference. I’ll be happy to look like a fool for writing this if Tim Cook walks on stage and shows us new, homegrown AR features that make Google’s reveals look like a three-year-old’s attempt at a Rembrandt. Heck, I’ll be relieved.
And the technology is certainly there, or close to it. ARKit is already impressive, and it may be more so with an upcoming release. Some rumors already suggest that Apple is planning on using a version of the TrueDepth technology it employs for Face ID on a rear camera, allowing for better mapping of objects for AR within the camera’s field of view. For that matter, Apple Maps is due for a big update, and it’s probably coming soon since the cars for Apple’s presumed version of Google Street View have grown increasingly common. Within a month, Apple could be showing us that it’s thought of how to use augmented reality within iPhones in ways we’ve never dreamed of.
But what if it doesn’t? What if, as many rumors already suggest, iOS 12 turns out to merely be a big bug cleanup patch after the comparatively infested iOS 11? I can’t imagine Apple would let a year go by without some kind of impressive feature update, and more than that, I can’t believe that Apple would produce something like ARKit and let third-party developers have all the fun. That’s just not Apple’s way.
Something is almost certainly coming, but it needs to happen sooner rather than later. The way things are going, Apple is in dire need of augmenting its own understanding of the reality beyond its own borders.
Leif is a San Francisco-based tech journalist. He's a big fan of fantasy RPGs, and you can find his previous work on IGN, Rolling Stone, VICE, PC Gamer, Playboy, Mac|Life, TechRadar, and numerous other publications.