The iPad Pro has frequently been an incubator for technology that Apple ultimately plans to roll out across the rest of its product line. Last year, the iPad Pro got a LiDAR scanner months before it appeared in the iPhone 12. This year’s 12.9-inch model introduced the mini-LED screen technology that will probably show up soon in a new line of MacBook Pro laptops. And this fall’s iPhone Pro models are rumored to come with high-refresh-rate displays, pioneered years ago on the iPad as ProMotion.
But there’s another core Apple technology of the future that’s currently available only on the newest iPad Pro. And I’m confident that, in the next couple of years, you’ll see it spread across most (but not all) of Apple’s products: Center Stage.
Center Stage uses machine-learning technology to pan and zoom in a camera’s field of view to get the perfect shot during a FaceTime call or other videoconference. It will zoom in on a single subject, or zoom out to find every person in the frame. If you haven’t tried Center Stage, you’ll need to trust me: It’s great. And having experienced it for months on my iPad Pro, I now want it everywhere. It’s too good a feature not to be, and as soon as possible.
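Apple hasn’t published how Center Stage works under the hood, but the basic idea—frame every detected face, then glide the crop toward that framing instead of jump-cutting—is simple enough to sketch. Here’s a toy version in Python; the face boxes, padding amount, and easing factor are all my own illustrative assumptions, not Apple’s.

```python
# Toy sketch of Center Stage-style auto-framing (illustrative only;
# Apple's actual pipeline is not public). Face boxes are tuples of
# (x0, y0, x1, y1) in sensor pixels.

def union_box(faces):
    """Smallest box containing every detected face box."""
    x0 = min(f[0] for f in faces)
    y0 = min(f[1] for f in faces)
    x1 = max(f[2] for f in faces)
    y1 = max(f[3] for f in faces)
    return (x0, y0, x1, y1)

def target_crop(faces, frame_w, frame_h, pad=0.25):
    """Pad the union of faces for headroom, clamped to the frame."""
    x0, y0, x1, y1 = union_box(faces)
    pw, ph = (x1 - x0) * pad, (y1 - y0) * pad
    return (max(0, x0 - pw), max(0, y0 - ph),
            min(frame_w, x1 + pw), min(frame_h, y1 + ph))

def ease(current, target, alpha=0.2):
    """Move the crop a fraction of the way toward the target each
    frame, so the virtual camera pans rather than snaps."""
    return tuple(c + alpha * (t - c) for c, t in zip(current, target))
```

Run face detection on each frame, compute `target_crop`, and apply `ease` before scaling the crop up to the output size: one person gets a tight shot, and a second person walking in widens the target, which the easing turns into a smooth zoom-out.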
The obvious destination: Mac
I like the new 24-inch M1 iMac, which was announced on the same day as the new iPad Pro. But it’s so frustrating that Apple introduced Center Stage on the iPad Pro and omitted it from the iMac. (Chalk that one up to parallel product development, I guess—I suspect that the iMac’s design predates the iPad’s, despite them arriving simultaneously.)
Really, it’s only a matter of time until Center Stage appears on the Mac. The technology is ideal for any device that generally stays still while people position themselves around its front-facing camera. The iMac fits that description exactly, which is why I’d expect the next iMac we see—presumably a replacement for the 27-inch Intel iMac that Apple is still selling—to support Center Stage. Eventually, the 24-inch model should be updated to support it, too. It just makes too much sense in a device that will be stuck on a desk or kitchen counter.
But I think Center Stage will make sense on Apple’s laptops, too. While I will occasionally take part in a videoconference with a MacBook on my lap, most of the time I’ll set the laptop on a table or chair. Families with Center Stage-equipped Mac laptops could turn them into great FaceTime devices just by setting them on the coffee table.
To me, the only question is when Center Stage will arrive. On the software side, I suspect Apple’s already got this covered—the code that’s running on the iPad Pro is, after all, already optimized for Apple Silicon. The real question is hardware. Center Stage takes advantage of an ultra-wide front-facing camera, so it can capture as much of a room as possible and then dynamically crop the image and remove distortion, giving you the sense that there’s a camera operator zooming and panning. Apple’s been slow to upgrade the front-facing cameras on Macs, but I very much hope that this next round of redesigned laptops leaves space for a high-resolution ultra-wide camera capable of being an excellent source for Center Stage.
Other anchored devices could benefit
A few months back, there were rumors that Apple was testing out a future HomePod that would have a screen. As a user of an Amazon Echo Show, I can see the appeal of such a device. But once Center Stage came on the scene, the shape of an Apple product strategy came into view.
An Echo Show, HomePod, or any other device that might be parked in a kitchen or living area has a real liability: it’s probably not going to get picked up and moved around. And yet, when you consider a device in this class, it’s hard not to imagine video calls being a key use case. I’ve tried video calls on my Echo Show, but the angle is lousy and Amazon’s calling software isn’t worth my time.
But when I imagine a similar device, built by Apple and with a camera equipped with Center Stage, it gets a lot more interesting. That device, anchored in one position, can follow you around the room as you talk. It’s a perfect match. (I also think this would be a great match for a future Apple TV model or Apple TV camera accessory that sits atop the TV screen, to enable FaceTime calls on the big screen—while keeping the actual picture it’s shooting dynamic and interesting.)
Dynamic devices? Not so much
As for the rest of Apple’s product line, I’m not so sure. I don’t really see how most iPhone use cases would work with Center Stage, since you hold an iPhone in your hands and can position it however you like. In that case, you’re the camera operator.
And then there’s the Apple Watch, which obviously doesn’t need Center Stage—or does it? Maybe not the Center Stage we know now, but I wonder if the same techniques that Center Stage uses to remove distortion from a wide-angle lens might allow a camera mounted somewhere on an Apple Watch to create a more pleasing image that looks more like a professionally shot video and less like a camera pointing up someone’s nose. There’s only so much Apple can do with a camera at that angle, but this technology is powerful.