In the era of the personal computer, we worshipped at the altar of processor speed, RAM capacity, and hard drive space. But in the era of the smartphone, none of those specs matter as much as they once did. Instead, we care more about what our devices can do, and chief among those capabilities is taking great pictures.
Look back over the past few years of smartphone announcements from industry leaders like Samsung, Google, and, of course, Apple, and you’ll see cameras occupying a lot of time in onstage demos as well as a prominent position in smartphone marketing and ads.
With the unveiling of Apple’s 2019 iPhone line-up a matter of weeks away, many anticipate a big improvement in the devices’ cameras. That’s no surprise: every time a new iPhone rolls around, Apple touts it as featuring the best camera yet. And no wonder: as good as the current iPhone’s camera is, there’s always room for improvement.
A shot in the dark
When it comes to smartphone cameras, Apple’s biggest competitor is, of course, Google. The Pixel 3 (and especially its XL model) has what many consider the best camera in the smartphone world right now, thanks in no small part to the computational heft that Google has thrown into picture-taking.
One thing in particular has become a standout in comparisons with the iPhone, and that’s the Pixel’s Night Sight feature. Low-light photos have always been a challenge for cameras, for the very simple reason that a camera forms an image from the light it gathers, and in the dark there simply isn’t much light to gather. Apple has touted improved low-light performance on several of its iPhones in the past, but Night Sight, which captures a burst of exposures and merges them computationally, has blown even the best of them out of the water. A low-light photo is never going to look as good as one taken with more illumination, but Night Sight gets closer than anything else on a smartphone.
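The core idea behind that kind of computational low-light photography can be seen in a toy sketch: average several noisy exposures of the same scene, and the random sensor noise cancels out while the scene reinforces itself. This is a minimal illustration in Python with simulated pixel data, not Google’s actual pipeline, which also has to align frames and handle motion.

```python
import numpy as np

def average_frames(frames):
    """Average a burst of noisy frames. Random sensor noise tends to
    cancel across frames, while the static scene adds up coherently."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a dim scene photographed 8 times with heavy sensor noise.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 20.0)   # "true" pixel values of a dim scene
frames = [scene + rng.normal(0, 5, scene.shape) for _ in range(8)]

merged = average_frames(frames)
# The merged frame lands much closer to the true scene than any single shot.
```

Averaging N frames cuts the noise roughly by a factor of the square root of N, which is why a handheld burst can pull a usable image out of near-darkness.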
The Pixel 3 debuted about a month after the iPhone XS last year, which means that people are going to be watching to see how—and if—Apple responds to its competitor after twelve months. Not offering improved low-light performance would be tantamount to ceding this area to Google, especially with a Pixel 4 on the horizon, so it’s a strong bet that Apple will try to offer its own take on the feature.
Selfie improvement
Like it or not, the selfie is here to stay. Social media has made the self-portrait ubiquitous, and the awkward arm’s-length pose has largely lost any stigma it might once have carried.
While on my travels recently, I took more than my fair share of selfies, and one thing I found especially frustrating was capturing enough of a scene. That means making sure not only that all your human subjects are in the picture, at a flattering angle to boot, but also that you adequately capture the background.
Once again, Google has taken this challenge on with the Pixel 3, which offers an additional wide-angle lens on the front, primarily to help you get all your friends into a single selfie.
Rumors of the upcoming iPhone suggest a wide-angle lens may indeed be coming to the rear of the phone, but there’s been nothing said about a similar front-facing feature, which would be useful not only for the aforementioned selfies, but also for, say, squeezing multiple people into a FaceTime call.
Call them “Panoramores”
On my recent travels, I ended up taking a lot of shots using the iPhone’s Panorama feature. There are some vistas that just can’t be captured by a single photo—they demand a more expansive picture, and a Panorama is all too happy to oblige.
The Panorama feature has improved a lot in recent years; it does a better job than ever of seamlessly stitching together the images captured by the iPhone’s camera. But that’s not to say it couldn’t be improved further.
For example, I’d like to see a Panorama mode that could capture even more of a scene, expanding not just horizontally but vertically as well. There are specialized cameras that can capture a full 360-degree scene, and while that may be beyond the limits of the iPhone’s hardware, perhaps there’s a middle ground: capturing a more immersive picture by moving the camera up and down as well as side to side.
And while we’re on the topic of immersive shots, we’ve all probably had an experience where we tried to take a panorama only to have someone standing in the way, or worse, walking through the scene as we’re capturing it. Best case scenario, there’s just a body we can crop out, but worst case we end up with a distorted person who looks like they got too near a black hole. It would be great if iOS could either provide tools to easily “paint out” those folks, or perhaps even automatically detect a person walking through the shot and remove them without our intervention. After all, the smartest camera features are the ones you never even see.
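Automatically removing a passer-by isn’t magic, either. If the camera captures several frames of the same scene, a per-pixel median across the frames keeps whatever is present most of the time (the background) and votes out anything transient (the person walking through). This is a simplified sketch with synthetic pixel data, assuming the frames are already aligned; a real implementation would also need to register the frames against each other.

```python
import numpy as np

def remove_transients(frames):
    """Per-pixel median across aligned frames: anything that appears in
    fewer than half the frames (e.g. a passer-by) is voted out."""
    return np.median(np.stack(frames), axis=0)

# Static background (value 100); a "pedestrian" (value 255) walks across,
# occupying a different column in each of 5 frames.
background = np.full((5, 5), 100)
frames = []
for col in range(5):
    f = background.copy()
    f[:, col] = 255   # the moving person in this frame
    frames.append(f)

clean = remove_transients(frames)
# → clean equals the background everywhere; the pedestrian is gone.
```

Because each pixel sees the pedestrian in only one of the five frames, the median at every position is the background value, which is exactly the “remove them without our intervention” behavior described above.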