The tech media has long compared Google’s Pixel phone with the iPhone, despite the incredible disparity in consumer appeal. After all, the Pixel is the only other phone actually made by the company that controls its primary ecosystem. It’s the Android phone by the Android maker.
For the last couple of years, watching the introduction of a new Pixel phone, it was easy to imagine an iPhone user looking at the camera features and results and thinking, “I wish my iPhone did that!” This year, while the Pixel 4’s camera capabilities might be better than the iPhone 11’s, Apple has at least caught up enough that the Pixel’s camera is no longer the envy of an iPhone user’s eye.
This year, the thing that makes iPhone users say, “I wish my phone did that,” is Google Assistant. It’s past time for Apple to step up Siri in a big way.
Siri’s squandered lead
Siri was first released as an iPhone app in early 2010. Apple knows something groundbreaking when it sees it, and it snapped up the company that originally created Siri before Android and BlackBerry (remember BlackBerry?) versions could be released. A year and a half later, Siri debuted as a beta feature of the iPhone 4s.
It proved wildly popular. So popular that the Siri back-end infrastructure couldn’t keep up with demand. No other phone had an assistant like Siri. Apple had a several-year head start on what would become a core feature of all smartphones and, eventually, smart home devices.
As it sometimes seems to do, Apple failed to recognize that its advantage was tenuous and must be vigorously defended. It didn’t invest nearly enough in its assistant technology, allowing Google—and some would say Amazon—to catch up and eventually pass it by. Now, Google Assistant on the Pixel 4 looks like the future, and Siri just feels like a more polished version of what we’ve been using for years.
We need a next-gen Siri, not just a better Siri
Apple has gotten serious about machine learning and its virtual assistant in the last couple of years, going on a huge hiring and acquisition spree to bolster its R&D efforts. But as a customer, I don’t feel like Siri is next-level. I feel like I’m fundamentally using the same Siri I have been for the last seven years.
Siri is dramatically better than it used to be, but it still works in essentially the same way, and does essentially the same things. Say “Hey, Siri” or press and hold the side/home button, and it takes over the entire screen, giving you hit-or-miss answers to certain classes of questions or performing carefully prescribed functions. It is an island unto itself, siloed into its own full-screen interface, and yet requires an internet connection (despite Apple’s stance on privacy and performing operations entirely on your iPhone).
Google’s demonstration of its new voice recorder feature that does real-time transcription was a dramatic display of its ability to understand speech, but more impressive is that it operated in airplane mode. In fact, many Google Assistant features will be run entirely on-device. This seems like the kind of thing Apple should have demonstrated when it overhauled the Voice Memos app in iOS 12, doesn’t it?
With all of Apple’s talk about privacy and security, why can’t Siri do on-device real-time voice transcription of our Voice Memos? Turn on airplane mode and you can’t even invoke Siri at all. You get a big fat error stating that you have to be connected to the internet.
Why? Why can’t I tell Siri to launch an app, or convert pounds into ounces, or roll dice, or tell me about any of the info that’s already on my phone (like calendar events or reminders)? Siri should only need to connect to the internet when the answer to a question has to come from there, like stock prices or sports scores. “Remind me when I get home to call Jon” should be able to set the proper reminder without any network connection. There’s no need to be online for “Show me photos of mom” or “Set an alarm for 7:30 tomorrow.”
Perhaps worse than the internet-connection requirement is the way Siri still feels like a separate entity, rather than a holistic part of everything I do on my iPhone.
Invoking Siri takes over the entire display. Why? In iOS 13, Apple made Siri a simple overlay along the bottom of the screen in CarPlay, but on your iPhone it still takes over your whole device. It’s a visual distinction that sends a clear message—Siri isn’t a part of what you’re doing; it’s something you stop what you’re doing to use.
It’s also blissfully unaware of the context of what you’re doing at the time. I should be able to have any webpage open and ask Siri to, for example, “Translate this page into Spanish.” Or select a word in any app, in any language, and ask, “What does this word mean?” to get a definition. If I have the Calendar app open to a specific day, I should be able to tell Siri, “Make an event for 6 p.m. to get drinks with Susie,” and it will know by context to put it on that day of my calendar, rather than today.
Queries and commands to Siri should understand the context of anything on my screen. If I’m watching a movie trailer on YouTube, I should be able to say, “Buy tickets to this,” and get nearby movie ticket results for that particular film.
This works in a very limited capacity today. For example, if I’m looking at an iMessage conversation with my wife, I can say, “Where is she?” and my phone will open her location in Find My Friends, because I have her added there. A true next-gen Siri should draw proper context from anything on my iPhone or iPad’s display, in addition to ambient sound and location—the full suite of sensor data.
The Siri experience needs a dramatic overhaul
Google may not sell as many Pixel phones as Apple does iPhones. The future of Apple may be services, not just hardware. But Apple would be well-advised to look at Google’s latest phone and feel a sense of paranoia. No advantage sticks around forever, and no ecosystem has a moat too big to cross. Apple should make beating Google Assistant as big a priority as it must have been to beat the Pixel’s camera.
It will have to “skate where the puck is going,” as the saying goes, to envision Google Assistant several years out and architect a completely new Siri experience to beat it. Hopefully, Apple recognized its long-lost leadership position in this area years ago and has been architecting a next-generation Siri experience for a long time.
The iPhone of the future needs more than an incredible camera, 5G connectivity, and a super-fast processor. Apple keeps reminding us that machine learning is used in all aspects of the operating system. It’s time to tie all that intelligence together and surface it in a whole new Siri experience, one that fully integrates with everything we do on our phones and runs entirely on-device wherever possible, instead of continuing to iteratively improve an interaction model that dates back to the iPhone 4s.
If Siri continues to evolve in the ways it has recently—adding a domain here or there, improving its voice, delivering slightly better results to specific types of queries—it will be left hopelessly in the dust by Google Assistant on Android phones and Alexa on, well, everything else. It’s already way behind, and pretty soon, consumers are going to really start to notice.