The Galaxy Note 8 is the first Samsung phone to feature a dual lens camera system. Similar to Apple’s iPhone 7 Plus, it includes a telephoto lens paired with a standard lens. This allows both phones to deliver fun depth of field effects—but does one company do bokeh better? Let’s check out the differences between their approaches, and see if one phone can emerge victorious.

Apple’s iPhone 7 Plus came out late last year while Samsung’s Galaxy Note 8 launches in September.
Specs
Apple’s “Portrait Mode” and Samsung’s “Live Focus” use their dual camera systems to gauge depth in a scene and introduce bokeh, or blur, into a photo taken with the telephoto lens. This mimics high-end DSLRs and creates a stunning effect when done properly. But we’re talking about smartphone cameras here, so let’s first dig into the phones’ not-so-DSLR-caliber specs.
On paper the differences may seem slight, but the two phones differ in some drastic ways. Both the Note 8 and the iPhone 7 Plus have dual 12-megapixel sensors. Both offer an effective 2x optical zoom between their two lenses. And both sport optical image stabilization (OIS) on the main lens. But that’s where the similarities stop.

The iPhone 7 Plus is the first phone from Apple to feature a dual lens system.
The Note 8’s main camera uses a dual-pixel sensor with larger pixels (1.4μm vs. 1.2μm on the iPhone). It also includes OIS on the telephoto lens, a first for any smartphone and very important for handheld shooting. The Note 8 also features faster apertures in both lenses: the main camera is f/1.7 (Apple’s is f/1.8), while the telephoto is f/2.4 (Apple’s is f/2.8). A faster aperture lets more light reach the sensor, resulting in a less noisy image.
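To put those aperture numbers in perspective: light gathered scales roughly with the inverse square of the f-number, all else being equal. Here’s a rough back-of-the-envelope sketch (it ignores sensor size, lens transmission, and processing differences) of how much of an edge those faster lenses give:

```python
def light_advantage(f_slow: float, f_fast: float) -> float:
    """Approximate light-gathering gain of a faster (lower) f-number.

    Rule of thumb: light reaching the sensor scales with 1 / f-number squared,
    all else being equal.
    """
    return (f_slow / f_fast) ** 2

# Main lenses: Note 8 at f/1.7 vs. iPhone 7 Plus at f/1.8
print(f"Main lens: ~{light_advantage(1.8, 1.7):.0%} of the iPhone's light")  # ~112%
# Telephoto lenses: Note 8 at f/2.4 vs. iPhone 7 Plus at f/2.8
print(f"Telephoto: ~{light_advantage(2.8, 2.4):.0%} of the iPhone's light")  # ~136%
```

By that rough math the Note 8’s main lens gathers around 12 percent more light than the iPhone’s, and its telephoto roughly a third more, which matters most in dim, handheld shooting.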

The Galaxy Note 8 takes what makes the S8 and S8+ great, and adds a dual lens system, a first for Samsung.
But specs only tell half the story, because great software can easily overcome inferior hardware. And if the experience of actually using the camera is poor, I don’t care how great the hardware is; I won’t want to use it.
Usability
Let’s get into user experience first. Switching to the iPhone 7 Plus’ Portrait Mode is simple, but it takes a second to kick in. Once it does, Apple’s interface is really good at telling you how to reach the sweet spot of the mode. The software recommends that you stand about 8 feet from your subject in order for the effect to trigger, and a box on the interface turns yellow when you can snap the portrait. If the conditions aren’t right, Apple tells you what you need to change in order to get the best results.

Apple makes Portrait Mode on the iPhone 7 Plus very user friendly.
Samsung? Not so much. After launching the camera, you just tap the Live Focus option, which kicks in super quickly. Once activated, you’re prompted to stay around 4 feet away from your subject in order for the effect to trigger. But when the effect doesn’t trigger, I find the prompts too vague to be helpful.

The Live Focus mode on the Note 8 struggles to provide clear details about what you need to fix when it doesn’t work.
Special features
Interface quirks aside, the Note 8 does have a couple of nice tricks up its sleeve. First, you can adjust just how much bokeh is introduced into the scene, with a handy slider that shows you, in real time, just how much you’re affecting the shot. And to take it to another level, the same adjustment can be made after the photo is snapped! From the gallery app you can save as many different variations as you want, since all the depth info is already embedded in the capture. The Note 8 also saves the photo from the main camera, just in case you want a different perspective of the scene you shot.

Samsung included some nifty features in its Live Focus mode, including the option to see a photo taken with the wide-angle lens.
Apple doesn’t have any of these options, but I’m hoping they come in a software update or appear natively in one of the new iPhone models being announced next week. For now, you can only choose to have the 7 Plus save a second version of the photo with the effect turned off.
Photo results
So how do the photos look? Let’s compare images of our fabulous model Cyndal, shot on the street in San Francisco. The light was changing rapidly that day so the exposures are a bit different from shot to shot. Nonetheless, both phones were able to capture pleasing images with a nice amount of depth.

Both the iPhone 7 Plus (left) and Galaxy Note 8 (right) add depth to the scene, drawing your eye to the model.
But we can already see a problem. The iPhone locks the face in focus and adds blur to everything else, even objects in the same plane of focus as her face. You can see in the iPhone photo below that the detail on her top is lost.

Apple (left) blurs everything but the model’s face. Samsung (right) keeps items in the same plane of focus sharp, the way it should be. Click to enlarge.
I’ve noticed this quirk ever since Portrait Mode was introduced last year, and I’m still confused as to why Apple makes this decision. It’s not how DSLRs work, and it just feels sloppy. The Note 8, on the other hand, has the model, top and all, in focus, with nice separation from the wall behind her.

iPhone 7 Plus (left) and the Galaxy Note 8 (right) both have a hard time with edge detection when it comes to hair. Click to enlarge.
Moving to a second street scene, we notice a couple more things. First, the Note 8 struggles with edge detection on hair just like the iPhone does. But more important is what’s happening to the garage in the top right corner of the frame.

Apple says it takes a limited number of depth samples around a person’s head and applies the same amount of blur to the rest of the scene. Samsung seems to gather even more depth info. Click to enlarge.
The iPhone is blurring the lines evenly, whereas the Note 8 is understanding the depth of the scene and blurring them in a gradient fashion. This suggests that the Note 8 is able to more accurately represent depth than the iPhone.
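To make that distinction concrete, here’s a toy sketch of the two behaviors as they appear in the photos: a flat blur that treats the whole background the same versus a blur radius that grows with estimated distance. The depth values and blur radii are made up for illustration; this isn’t Apple’s or Samsung’s actual pipeline.

```python
import numpy as np

MAX_RADIUS = 8  # arbitrary maximum blur radius, in pixels

def blur_radius_uniform(depth_m: np.ndarray) -> np.ndarray:
    """Flat approach: everything behind the subject gets the same blur."""
    background = depth_m > depth_m.min()
    return np.where(background, MAX_RADIUS, 0)

def blur_radius_graded(depth_m: np.ndarray) -> np.ndarray:
    """Graded approach: blur grows with distance behind the subject."""
    rel = (depth_m - depth_m.min()) / (depth_m.max() - depth_m.min())
    return np.round(rel * MAX_RADIUS).astype(int)

# Toy depth map: subject at 2 m, wall at 4 m, the garage lines receding to 10 m
depth = np.array([2.0, 2.0, 4.0, 6.0, 10.0])
print(blur_radius_uniform(depth))  # [0 0 8 8 8] -> even blur, like the iPhone
print(blur_radius_graded(depth))   # [0 0 2 4 8] -> gradient blur, like the Note 8
```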

In such a complex scene both the iPhone 7 Plus and the Note 8 have a hard time knowing what should be in focus and what shouldn’t. Click to enlarge.
Checking out this grassy location reveals some of the same quirks. Both phones continue to struggle with complex scenes, and as a result they keep some things in focus that should be blurred, and vice versa.

There is far more blur applied to the image coming out of the 7 Plus compared to the Note 8.
Zooming into the top left corner, we also notice that at its maximum depth, the iPhone 7 Plus applies far more blur than the Note 8.

This garage door proves to be an interesting test for both the iPhone 7 Plus and Galaxy Note 8.
This last example in front of the garage door again shows the weaknesses of both cameras. The Note 8 tried, and failed, to work out what was supposed to be out of focus, resulting in weird patches of blur.

There’s a halo effect around the model and the edge of the frame on the iPhone 7 Plus. But the Note 8 chose to only blur some of the garage door, making abstract shapes in the pattern.
And while the iPhone blurred everything, including her top, in equal amounts, it also drew unnatural halos around the model (check out her hair in the shot above) and along the edges of the frame.
Conclusion
So what did we learn from all these comparisons? We learned that both companies still have a long way to go in perfecting their depth-of-field modes. Samsung has leapfrogged Apple in terms of features and does a far better job of gauging the depth of the whole scene. But Apple’s mode is easier to use and will most likely be updated soon with new hardware and software.
Either way, I’m super happy that both companies are continuing to push mobile photography forward. The best camera is the one you have on you, so why not have it be awesome?

The future of smartphone photography is tied to the future of machine learning. The more information a camera can gather from a scene, the more it can manipulate it.