Capture, an iPhone 3D scanner app, hints at the future of augmented reality

The 'Capture' app is far from perfect, but it may be a rough draft of features to come.

Leif Johnson/IDG


3D scanners don’t have to cost a fortune: As it turns out, you may already have one in your pocket. A new free app called Capture uses the TrueDepth sensors on X-series iPhones to make crude 3D models of small objects, which you can then plop into different settings through augmented reality or share with other folks through iMessage or other means.

Or let’s put it this way: With Capture, I can use my iPhone XS Max to scan a 3D model of Macworld’s old Macintosh SE, being careful to scan each angle I can reach (or until the app times out). Once done, I can flip and spin the 3D model around inside the app as though the old Mac were a prop in a video game. I could also send it to a friend, who’d then be able to use her iPhone’s camera to see how it looks on her own desk through augmented reality. This works even with her iPhone 7, which doesn’t have the TrueDepth sensor built in.

Leif Johnson/IDG

The real Macintosh SE is in the photo on the left, while the AR version I “placed” on the table is on the right. Unfortunately, the scans are always colorless when used in AR.

Sounds revolutionary, right? Alas, for now it looks as basic as biscuits, which is why you likely haven’t heard of it. Even the publicity images from developer Standard Cyborg look like they’re being disintegrated at the end of Avengers: Infinity War. And as with many other scanning apps, you’re never going to be able to make a true 3D model with Capture’s current version. Even when I could get a decent scan of one side of an object, I wasn’t able to scan its underside.

Facing forward

But don’t blame Standard Cyborg for this. Capture’s awkwardness springs from the current design of the TrueDepth sensors, which are made to read faces from only a few inches away. That means Capture only works well if you’re holding something in your hands (or scanning your face) as the iPhone only has TrueDepth sensors on the front-facing camera.
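Standard Cyborg hasn’t published Capture’s source, but any iOS app that reads TrueDepth data goes through Apple’s AVFoundation framework. As a rough sketch of what that plumbing looks like (class names here are illustrative, not Capture’s actual code), an app requests the front-facing TrueDepth camera and streams per-pixel depth maps from it:

```swift
import AVFoundation

// Illustrative sketch: streaming depth frames from the TrueDepth camera.
// A scanning app like Capture would fuse these frames into a point cloud
// or mesh; that reconstruction step is omitted here.
final class DepthCaptureController: NSObject, AVCaptureDepthDataOutputDelegate {
    private let session = AVCaptureSession()
    private let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // The TrueDepth module exists only on the front camera, which is
        // why scanning means holding the object up in front of the phone.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else {
            return // no TrueDepth hardware on this device (e.g. iPhone 7)
        }
        let input = try AVCaptureDeviceInput(device: device)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(depthOutput) {
            session.addOutput(depthOutput)
            depthOutput.isFilteringEnabled = true // interpolate holes in the depth map
            depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        }
        session.commitConfiguration()
        session.startRunning()
    }

    // Called once per depth frame; Spiegel's 30 fps claim refers to how
    // fast frames like these can be processed into a growing 3D model.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let depthMap = depthData.depthDataMap // CVPixelBuffer of per-pixel depth
        _ = depthMap
    }
}
```

The sensor's short effective range and front-facing placement fall directly out of this design: the same hardware that maps a face at arm's length is being repurposed to map whatever else you can hold at that distance.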

If you try to scan something in the same way you’d take a video with the rear camera, you’ll be scanning blindly, guided only by subtle vibrations that remind me of notification taps on an Apple Watch. If you move too quickly, Capture ends the scan. As a result, my folder of finished scans looks like a grab bag filled with more tricks than treats. Frankly, I’d be angry if Standard Cyborg were charging any cash for this.

But according to Standard Cyborg co-chief Garrett Spiegel, this is still better than what you’ll find in competing apps. In a conversation on Twitter, Spiegel told me that most competing apps are made by hobbyists who lack his company’s venture financing and engineers with experience in computational geometry or computer vision. These apps are often slow and unstable, he says, and they’re designed so poorly that they even make iPhones run hot.

“By comparison, our algorithms are written in a way such that our app does over 30 frames per second (more than twice the next best app), which means the scanning is more seamless and similar to just taking a panorama (and the speed is already getting better),” Spiegel said.

Spiegel also highlighted Standard Cyborg’s integrated cloud storage system with machine learning models, which makes it easy to import Capture scans into other apps.

Capture is free, so I can appreciate it as a novelty that highlights the potential of TrueDepth. TrueDepth, after all, is one of Apple’s greatest recent innovations: a facial recognition technology that sprays lasers into every dimple and crevice, and Capture’s images remind us that TrueDepth works a bit like a blind man “seeing” through touch. That makes it years ahead of the photo-based facial recognition systems we get from competitors like Samsung. Only recently have Huawei and Xiaomi implemented infrared scanners that pose any serious competition.

Augmenting AR

But Capture also suggests that 3D scanning could be the path to making augmented reality as popular as Apple clearly wants it to be, particularly once Apple figures out how to get TrueDepth to work with the rear cameras. So far Apple (and everyone else) has struggled to show us practical applications for AR: The best we get are realistic dissection apps or games that require a bit too much movement.

But indulge me for a bit. The scanning technology we see in Capture presages a future in which I could send a 3D scan of a mystery auto part to my father, who’d then be able to overlay it on the engine of his old Ford Mustang to see if it fits. Or, in keeping with the season, I could send a scan of a Christmas tree topper to my girlfriend, who’d then be able to tell me from home if it’d look as good as I think. It’s AR for the people.

As for Spiegel, he envisions developers using Capture’s scanning technology to make apps that help consumers figure out their shoe size for a specific brand or find eyeglasses that perfectly fit the shapes of their noses. For that matter, he imagines the technology behind Capture could be used to create virtual avatars for gaming.

Leif Johnson/IDG

Capture’s scanning in action. The Macintosh SE was technically too big for a good scan, so I’m impressed that it turned out as well as it did.

This is all somewhat science fiction for the moment, but I don’t think we’re too far from the reality. Sources who spoke with Bloomberg suggested in 2017 that we could see 3D sensors in the iPhone’s rear camera as early as 2019, although more recent reports from the likes of trusted analyst Ming-Chi Kuo have pushed that date back. But this needs to happen, Spiegel suggests, if AR is ever to become truly popular.

“We aren’t sure exactly when this will happen, but it seems bound to happen,” he says, referring to the 3D scanning technology on rear cameras. “It’s unclear which technology they will use (TOF, VCSEL, etc), but it’s clear that current passive AR approaches (such as waving your phone around to use it) aren’t going to cut it.”

At the very least, Capture reminds us of the awesomeness of TrueDepth. Animoji and Memoji hint that TrueDepth is capable of something like this, but even so, most people are only aware of it when they unlock their phones or use Apple Pay or Portrait Mode. Even then, the magic behind the scenes isn’t really apparent.

But technology like Capture’s could turn that around. It could also be the feature that makes phones like the iPhone X and iPhone XS more popular than they already are.

“We think this sensor is a HUGE win for consumers and as these practical applications launch and become mainstream in the next few months, it will convince many consumers to upgrade or switch to iPhones that have FaceID/TrueDepth,” Spiegel said.
