You might have missed the Apple references during last week’s Google Pixel 2 event, but they were there. There was VP of Product Management Mario Queiroz telling the crowd, “We don’t set aside better features for the larger devices.” Or pointing out that “even iMessages” would be transferred over when you decide to switch. And let’s not forget the woman taking a big bite out of an apple during the Pixel 2’s intro video.
In fact, it seemed like every new Google product released last week had a singular message seemingly aimed squarely at Apple. Google might have a long way to go before it starts selling iPhone-like numbers of Pixel phones, but there is one important area where it’s firmly in the lead, and it has nothing to do with bezels or beats. It’s about intelligence.
At its recent iPhone X event, Tim Cook pulled out Steve Jobs’s old Wayne Gretzky quote about skating to where the puck is going, but there’s only one company that’s thinking forward right now and it’s not Apple—it’s Google. New hardware might have been the reason for the event, but machine learning was the strongest undercurrent, and the message Google sent was clear: Our AI is better than your AI.
Brains before beauty
Google CEO Sundar Pichai kicked off the Made by Google event by talking about his favorite topic: machine learning. It’s no longer enough that Google makes uncannily accurate search algorithms. Google is using its AI smarts to make its products more responsive and adaptable to each user’s lifestyle.

Google Home Mini will put Google Assistant in every room of your house.
It’s not about specs (although the Pixel phones have really good ones) or design (although the Google Home Max will look good in any room). In fact, none of Google’s new products are all that interesting on the surface, but what’s inside is leaps and bounds ahead of what Apple is doing with Siri and iPhone X. It’s about smarts, and Google has integrated Google Assistant and machine learning into every one of its devices in a, dare I say it, Apple-like way.
There’s the impulse-buy Google Home Mini and the high-fidelity Max to put Assistant in every room of your house. Active Edge on the Pixel lets you squeeze the sides of your phone to launch Assistant. And the Assistant-powered Pixel Buds feature a remarkable live translation feature. Google’s latest products are designed from the inside out to be smarter than they are pretty, a big bet that consumers are tired of good-looking gadgets that put form before function. And if it’s right, Apple could be playing catch-up for years to come.
Conversations, not commands
Apple’s first home AI speaker won’t hit shelves until December, but Google already has three of them. When it was announced in June, HomePod appeared to have an advantage over Google Home and Amazon Echo with its high-fidelity, room-sensing smarts, but now Google Home Max has landed, and it might be even better.

HomePod intelligently scans your room to deliver the perfect sound.
Like HomePod, Google Home Max uses machine learning to analyze your space and deliver optimal sound, but Google’s method gets more granular, fine-tuning the audio based on the song you’re listening to as well as the Max’s surroundings. But if the two are even when it comes to sound, the real difference-maker is Google Assistant.
Apple has improved Siri’s speech patterns in iOS 11, but for the most part, its AI ambitions have been relatively conservative. Google Assistant isn’t just better at recognizing what you’re saying, it’s more contextual and conversational, which leads to an all-around better experience. And with the new routines feature, you’ll be able to combine several tasks (like shutting off the lights, setting an alarm, and activating your security system) with a single phrase. It even recognizes your voice over those of the other people in your home. With Siri, commands are islands unto themselves, while Google Assistant is practically like talking to an actual person.
A smarter camera lens
It was just a year ago when Assistant was limited to Pixel phones and Google Home, and now it’s everywhere: headphones, watches, speakers, not to mention hundreds of millions of Android phones.

Google Lens is like AI for your eyes.
And now Google is branching out beyond simple voice commands. Exclusive to the Pixel phones (at least for now) is a new app called Google Lens, and it has the potential to be just as instrumental to Google’s AI push, if not more so. A combination of augmented reality and artificial intelligence, Google Lens uses your phone’s camera to identify buildings or flowers, scan and store phone numbers, even input Wi-Fi passwords, all without needing to jump around to various apps.
This isn’t a fancy box for Google Assistant, it’s a whole new set of skills. Apple doesn’t have anything close to this type of functionality, and Google is set to begin shipping it in a few weeks. If it’s as fast and as accurate as it is in Google’s demos, it will be nothing less than a game-changer for search.
Big beautiful brain
Google is in a unique position to excel in the AI space. Where Apple needs to keep wowing us with drool-worthy hardware to get noticed, Google has taken a utilitarian approach to its design, banking on AI to drive the experience. The Pixel 2 isn’t as pretty as the iPhone X or even the iPhone 8 for that matter, but Google is selling brains before beauty.

The Pixel 2 isn’t as pretty as iPhone X, but it’s way smarter.
In its first year, Google Assistant has advanced further than Siri has in the past seven. And it’s getting smarter every day. Not only is it basically on every phone that ships (including iPhones), but it’s in our homes, our cars, on our wrists, gathering information and learning how to better respond to our needs. And with Google Lens, there’s no telling how intelligent it will be this time next year.
Google might never design a phone that’s as beautiful as iPhone X. But one thing is for sure: It’s closer than Apple is to making one that’s smarter than a Pixel.