The Galaxy Note 8 is the first Samsung phone to feature a dual lens camera system. Similar to Apple’s iPhone 7 Plus, it includes a telephoto lens paired with a standard lens. This allows both phones to deliver fun depth of field effects—but does one company do bokeh better? Let’s check out the differences between their approaches, and see if one phone can emerge victorious.
Apple’s “Portrait Mode” and Samsung’s “Live Focus” use their dual camera systems to gauge depth in a scene and introduce bokeh, or blur, into a photo taken with the telephoto lens. This mimics high-end DSLRs and creates a stunning effect when done properly. But we’re talking about smartphone cameras here, so let’s first dig into the phones’ not-so-DSLR-caliber specs.
On paper the differences may seem slight, but the two phones differ in some drastic ways. Both the Note 8 and the iPhone 7 Plus have dual 12-megapixel sensors. Both have an effective 2x optical zoom between each camera. And both sport optical image stabilization (OIS) on their main lens. But that’s where the similarities stop.
The Note 8 features larger, dual-pixel sensors (1.4μm vs 1.2μm). It also includes OIS on the telephoto lens, a first for any smartphone and very important for handheld shooting. The Note 8 also features faster apertures in both lenses: the main camera is f/1.7 (Apple's is f/1.8), while the telephoto is f/2.4 (Apple's is f/2.8). A faster aperture lets more light reach the sensor, resulting in a less noisy image.
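To put those f-stops in perspective, light gathered scales with the inverse square of the f-number. A quick back-of-the-envelope calculation (the helper function here is my own, only the f-numbers come from the phones' specs):

```python
def light_ratio(f_slow: float, f_fast: float) -> float:
    """How many times more light the faster (lower f-number) lens gathers.

    Light gathered scales with the inverse square of the f-number,
    so the ratio is (f_slow / f_fast) squared.
    """
    return (f_slow / f_fast) ** 2

# Main cameras: Note 8 at f/1.7 vs iPhone 7 Plus at f/1.8
main = light_ratio(1.8, 1.7)
# Telephoto lenses: Note 8 at f/2.4 vs iPhone 7 Plus at f/2.8
tele = light_ratio(2.8, 2.4)

print(f"main: {main:.2f}x, telephoto: {tele:.2f}x")
# The Note 8's main lens gathers roughly 12% more light,
# and its telephoto roughly 36% more.
```

That telephoto gap matters most, since portrait shots are taken with the telephoto lens, often in less-than-ideal light.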
But specs only tell half the story, because great software can often overcome inferior hardware. And if the experience of actually using the cameras is poor, it doesn't matter how great the hardware is; I won't want to use it.
Let's get into user experience first. Switching to the iPhone 7 Plus' Portrait Mode is simple, but it takes a second to kick in. Once it does, Apple's interface is really good at telling you how to reach the mode's sweet spot. The software recommends that you stand about 8 feet from your subject for the effect to trigger, and a box on the interface turns yellow when you can snap the portrait. If the conditions aren't right, Apple tells you what to change in order to get the best results.
Samsung? Not so much. After launching the camera, you just tap the Live Focus option, which kicks in super quickly. Once activated, you're prompted to stay around 4 feet away from your subject for the effect to trigger. But when the effect doesn't trigger, I find the prompts too vague to be helpful.
Interface quirks aside, the Note 8 does have a couple nice tricks up its sleeve. First, you can adjust just how much bokeh is introduced into the scene. There’s a handy slider to see, in real time, just how much you’re affecting the shot. And to take it to another level, the same thing can be done after the photo is snapped! From the gallery app you can save as many different variations as you want since all the info is already embedded in the capture. The Note 8 also saves the photo from the main camera, just in case you want a different perspective of the scene you shot.
Apple doesn’t have any of these options, but I’m hoping they come in a software update or appear native in one of the new iPhone models announced next week. For now, you can only select to have the 7 Plus save a second version of the photo with the effect turned off.
So how do the photos look? Let’s compare images of our fabulous model Cyndal, shot on the street in San Francisco. The light was changing rapidly that day so the exposures are a bit different from shot to shot. Nonetheless, both phones were able to capture pleasing images with a nice amount of depth.
But we do see a problem already. The iPhone locks the face in focus and adds blur to everything, even objects that are in the same plane of focus as her face. You can see in the iPhone photo below that the detail on her top is lost.
I've noticed this quirk ever since Portrait Mode was introduced last year, and I'm still confused as to why Apple made this decision. It's not how DSLRs work, and it just feels sloppy. The Note 8, on the other hand, has the model, top and all, in focus, with nice separation from the wall behind her.
Moving to a second street scene, we notice a couple more things. First, the Note 8 struggles with edge detection on hair, just like the iPhone. More important, though, is what's happening to the garage in the top right corner of the frame.
The iPhone is blurring the lines evenly, whereas the Note 8 is understanding the depth of the scene and blurring them in a gradient fashion. This suggests that the Note 8 is able to more accurately represent depth than the iPhone.
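The gradient behavior the Note 8 shows can be thought of as blur strength growing with distance from the focal plane. Here's a minimal toy model of that idea (the formula, numbers, and function name are my own simplification for illustration, not either phone's actual pipeline):

```python
def blur_radius(depth_m: float, focus_m: float, max_radius: float = 10.0) -> float:
    """Toy depth-aware blur: radius grows with distance from the
    focal plane, capped at max_radius. Units are arbitrary."""
    return min(max_radius, abs(depth_m - focus_m) * 2.0)

# A garage door receding from ~2 m (near the subject) to ~6 m:
for depth in (2.0, 3.0, 4.5, 6.0):
    print(f"{depth} m -> blur radius {blur_radius(depth, focus_m=2.0)}")
# Closer parts of the door stay sharper; farther parts get blurrier.
```

A uniform approach, which is what the iPhone's result suggests, would instead apply one fixed radius to everything outside the subject mask, regardless of how far away it actually is.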
Checking out this grassy location reveals some of the same quirks. Both phones continue to struggle with complex scenes, and as a result they keep some things in focus that should be blurred, and vice versa.
Zooming into the top left corner, we also notice that at its maximum depth, the iPhone 7 Plus is far blurrier than the Note 8.
This last example in front of the garage door again shows the weaknesses of both cameras. The Note 8 was trying, and failing, to work out what was supposed to be out of focus, resulting in weird patches of blur.
And while the iPhone blurred everything, including her top, in equal amounts, it also put borders around the model (check out her hair in the shot above) and on the edges of the frame in an unnatural way.
So what did we learn from all these comparisons? We learned both companies still have a long way to go in perfecting their depth of field modes. Samsung has leapfrogged Apple in terms of features and does a far better job of gauging the whole depth of a scene. But Apple's mode is easier to use and will most likely be updated soon with new hardware and software.
Either way, I'm super happy that both companies are continuing to push mobile photography forward. The best camera is the one you have on you, so why not have it be awesome?