iOS 16 has arrived, and if you've downloaded it on your iPhone, you're already enjoying the new Lock Screen, Messages, and Focus features. But depending on which iPhone you have, some features might be missing—and not because Apple has delayed them. Depending on your iPhone's age and processor, various elements of the new operating system just won't work. Here are the new iOS 16 features you won't be getting on your device and why.
Lift the subject out of a photo
We’re obsessed with the new feature that lets you cut and lift the subject out of a photo with a single long press. But this is only available on iPhones with an A12 Bionic or better processor, which means you’ll need an iPhone XS, XR or later (or an iPhone SE from 2020 or 2022).
Live Text upgrades
The upgrades to Live Text that let you copy text from paused video, convert currency, and translate languages with a single tap are only available if your iPhone has an A12 Bionic or later processor. Again, that means the iPhone XS or later.
Lock screen’s multilayered photo effect
The updated lock screen includes a neat 3D effect that layers photo subjects. But, you guessed it, this only works if you've got an iPhone XR/XS or later.
Track medication with the camera
iOS 16 includes an update to the Health app that lets you create a list of medications. For added convenience, you can simply hold the label in front of your iOS 16 device's camera and the details will be entered for you, but only if you've got an iPhone XS, iPhone XR, or later.
Emoji in dictation
iOS 16 lets you insert emoji into Siri-composed texts, or at any time when you're using dictation. But you'll need an A12 Bionic or later processor, which again means iPhone XS/XR and up.
More Siri enhancements
Siri gets more enhancements in iOS 16, including the ability to hang up calls (including FaceTime calls), a greater range of offline commands, and a discovery feature where you say "Hey Siri, what can I do here?" and it explains what actions are available in a particular app or context. But like the others, these features require an A12 Bionic or better, which means iPhone XS/XR or later.
‘Fluid’ dictation experience
iOS 16 lets you switch between voice and touch while using dictation: You can type on the keyboard, accept QuickType suggestions and move the cursor without having to leave dictation to do so. But only if you’ve got an A12 Bionic or better, which means iPhone XS/XR or later.
Share ID with apps
As of the iOS 16 update, you can add a driver’s license or state ID to Wallet and then use that to verify your identity or age in relevant apps. (Only the verification will be shared, not your personal data.) This will work with all iPhones compatible with iOS 16 (iPhone 8 or later), but you’ll need an Apple Watch Series 4 or later to get it on your wrist.
Key sharing in Wallet
This is probably obvious, but the clever new sharing features for digital keys depend on implementation on the car/property end as well as in iOS 16. If your car maker/brand hasn’t implemented the change or isn’t participating, for example, this feature won’t be available for you.
New Home app architecture
There's a new Home app with a new design and architecture, but bear in mind that 1) all devices that access that smart home need to be on the latest software, and 2) many of the features require a home hub, and if your hub is an iPad, you don't qualify for the new architecture. For these reasons you may find that the Home app isn't behaving the way you expect in iOS 16.
Next-generation CarPlay
Excited by the promised changes to CarPlay? We are too, but they're not ready yet and won't be until iOS 17 at the earliest.
Detection Mode in Magnifier
A new Magnifier mode is designed to improve accessibility by detecting and describing nearby objects, including the ability to locate doors and get instructions for how to open them. But because it relies on the LiDAR scanner, this is available only on recent and high-spec hardware: You'll need an iPhone 12 Pro or iPhone 13 Pro if you want to use the feature on an iPhone. The mode is also available on the 4th- and 5th-gen versions of the iPad Pro 12.9-inch, and the 2nd- and 3rd-gen versions of the iPad Pro 11-inch.
Live Captions
A handy new feature for deaf or hard-of-hearing users: iOS can automatically generate transcriptions of audio and video, and this is even available live, with speaker attribution, in FaceTime conversations. It will initially be available for English-language transcription in the US and Canada only, however, and requires an iPhone 11 or later. (The feature is also available on any iPad with an A12 Bionic processor or later, and on all Macs with Apple silicon.)
Foreground blur and improved Cinematic mode
A pair of related upgrades to discuss here: You can now blur foreground objects in portrait photos, and Apple says the depth-of-field effect is more accurate when handling hair and glasses in Cinematic mode. But only if you've got an iPhone 13 or 13 Pro.
Recognition of birds, insects, and statues
Visual Look Up, launched in iOS 15 last year, is an AI-powered search tool that lets you identify plants, animals, and landmarks in your photos. As of iOS 16, it adds birds, insects, and statues to its repertoire, but only if you've got an iPhone with an A12 Bionic or better processor. One final time: that means the iPhone XS/XR or later.