During its I/O Developers Conference on Tuesday, Google did a live demo of the next generation of Google Assistant running on a Pixel phone. It’s a dramatic rethinking of how the Assistant interacts with you and your phone, down to its interface, which barely takes up any space on the screen. With a single “Hey Google” command, the presenter was able to execute 12 actions in 40 seconds, including launching and searching within third-party apps:
Set a timer for 10 minutes
What’s the weather today?
What about tomorrow?
Show me John Legend on Twitter
Get a Lyft ride to my hotel
Turn the flashlight on
Turn it off
Take a selfie
All that took just 40 seconds, but it’s not just the speed that’s impressive. Google Assistant only had to be summoned once, and it continued listening, understood the context of what was being asked, and didn’t require any, um, assistance. When asked to get a Lyft ride to “my hotel,” it knew where she was staying. It launched John Legend’s profile in the Twitter app. It even started a countdown after opening the front camera.
I tried the same thing with Siri and wasn’t nearly as successful. I had to press the button each time, since Siri doesn’t wait around after completing a task. (To be fair, Google Assistant doesn’t on phones yet either.) Siri was able to parse “it” to mean flashlight when I asked to turn it off, but Siri wasn’t able to launch a route on Lyft, and merely showed recent tweets about John Legend. I still had to press the shutter when the selfie cam came up. All in all, it still took more than 50 seconds to accomplish fewer tasks.
Google Assistant is so far ahead, I’m not sure Apple can ever catch up. The next-generation Assistant is due to launch in the fall on Pixel phones, and quite frankly, Siri isn’t even as good as this generation. Where does Apple go from here?
Shrinking the cloud
To accomplish the tremendous speed boost, Google moved the entire AI model to the device itself, trimming the whole stack from 100GB down to less than half a gigabyte. That means requests don’t need to be routed through Google’s servers, and Assistant can work even when your internet doesn’t. It’s not clear exactly how much you’ll be able to do without being online, but general tasks like checking your calendar appointments and turning on the flashlight can be done locally on your phone.
That might sound familiar. Apple has made a big deal about on-device processing and machine learning as part of its privacy push, and there has been persistent speculation that it is developing an offline Siri. To that end, Apple has been anonymizing and encrypting Siri data for years and using differential privacy to analyze data without collecting it. It’s been a dividing line between Apple and Google, one that has only deepened over time, but with the second-generation Assistant, Google is proving that it can both collect and protect your data.
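To see why differential privacy lets a company analyze data “without collecting it,” consider randomized response, one of the classic techniques in the field. This is a toy sketch only; Apple’s actual Siri pipeline isn’t public, and every name here is illustrative:

```python
import random

def randomized_response(truth: bool) -> bool:
    """With probability 1/2 answer honestly; otherwise answer at random.
    Any single user's answer is deniable -- it may just be coin flips."""
    if random.random() < 0.5:
        return truth
    return random.random() < 0.5

def estimate_rate(responses) -> float:
    """Invert the noise on the server: E[yes] = 0.25 + 0.5 * true_rate."""
    yes = sum(responses) / len(responses)
    return (yes - 0.25) / 0.5

random.seed(42)
# 100,000 users, roughly 30% of whom would truthfully answer "yes".
true_answers = [random.random() < 0.3 for _ in range(100_000)]
noisy = [randomized_response(a) for a in true_answers]
print(round(estimate_rate(noisy), 2))  # close to 0.30 despite per-user noise
```

The aggregate statistic survives even though no individual response can be trusted, which is the general trade Apple is making.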
Of course, Google still accesses far more data than Apple does—in fact, one of the key new Assistant features, Personal References, rifles through your messages and calendar entries to learn more about the people in your life—but Google is definitely rethinking how much needs to leave your phone. You can credit Apple with that shift in methodology, but Google appears to be doing a lot more. From the unobtrusive interface to the tremendous speed boost, Google understands what people want out of an assistant, and that includes keeping things as private as possible.
In an op-ed for the New York Times earlier this week, Google CEO Sundar Pichai outlined a new approach for AI called “federated learning,” which, like Apple’s differential privacy, aims to improve products without collecting raw user data: “We’re also working hard to challenge the assumption that products need more data to be more helpful. Data minimization is an important privacy principle for us, and we’re encouraged by advances developed by Google A.I. researchers called ‘federated learning.’ It allows Google’s products to work better for everyone without collecting raw data from your device.”
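The idea Pichai is describing can be sketched in a few lines: each device improves a shared model locally, and only the model updates, never the raw data, are averaged on a server. This is a deliberately toy version (the “model” is a list of weights and local “training” is a fake gradient step), not Google’s actual system:

```python
def local_update(global_weights, local_data, lr=0.1):
    """On-device step: nudge each weight toward the mean of this
    device's private data. The data itself never leaves the device."""
    target = sum(local_data) / len(local_data)
    return [w + lr * (target - w) for w in global_weights]

def federated_average(updates):
    """Server step: average the per-device weights elementwise."""
    n = len(updates)
    return [sum(ws) / n for ws in zip(*updates)]

# Three devices, each holding data the server never sees.
device_data = [[1.0, 2.0], [3.0], [5.0, 7.0]]
global_weights = [0.0, 0.0]

for _ in range(20):  # a few federated rounds
    updates = [local_update(global_weights, d) for d in device_data]
    global_weights = federated_average(updates)

print([round(w, 2) for w in global_weights])
```

The server ends up with a model shaped by everyone’s data while only ever handling weight updates, which is the “data minimization” Pichai is touting.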
Skeptics will scoff at Google’s privacy claims here, but the fact of the matter is, Google is doing incredible things with a fully on-device, offline Assistant. No matter how many toggles it builds into Android, privacy issues will always plague Google, and for many Apple fans, its efforts will never be enough. But Assistant is acting in ways Siri is still only thinking about, and Google is taking the fight to Apple in a big way. Apple has been banging the privacy drum for a while, but it has yet to show how it can keep data local and streamline Siri’s usefulness. With the next-gen Assistant, Google is doing both, leaving Siri to play catch-up.
How does Apple respond? With WWDC set to kick off in just a few weeks, all eyes will be on Siri and Apple’s new senior vice president of Machine Learning and AI Strategy, John Giannandrea, who was plucked from Google last year. It’s been years since Apple outlined its stance on using data to bolster Siri’s smarts, and now it’s time to show us all what it can do. Because pretty soon, it might be too late.
Michael Simon has been covering Apple since the iPod was the iWalk. His obsession with technology goes back to his first PC—the IBM Thinkpad with the lift-up keyboard for swapping out the drive. He's still waiting for that to come back in style tbh.