As we get further away from this year’s Worldwide Developers Conference, the reality of Apple’s latest OS upgrades is beginning to sink in. That’s even more the case this week, as Apple’s public betas for iOS, iPadOS, tvOS, and macOS arrived slightly earlier than expected.
Of course, betas, like the future, are always in motion, and there’s no guarantee that what we see now is what will end up shipping in the fall—but usually the tweaks between then and now are on the minor side, more about stability and usability than big foundational changes.
And so, with the public betas in hand, it’s becoming clear which of our much-hoped-for improvements we won’t be getting this fall. Like many, I have a personal list of features I’d hoped to see Apple implement, but I’m now coming to terms with the fact that I may have to wait for iOS 14.
Audio ins and outs
The iPad has become an ever more capable device, and when iPadOS arrives this fall with enhancements to the Files app and improved multitasking, it’ll be even closer to being able to truly do all the things I need to do for work.

The iPad has a way to go before it can be used as a device for professional audio production.
With one big exception: recording podcasts. The most significant limitation right now is the rudimentary way that iOS and iPadOS handle audio: though it’s possible to attach an external microphone and have it work, only a single app can access the microphone at a time. That means you can’t use the mic in an application like Skype and record in a separate application simultaneously.
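To put that in developer terms, here’s a minimal, hypothetical sketch of how an iOS app claims the microphone via AVAudioSession (the function name and recording settings are purely illustrative). The point the article makes is visible right here: once one app activates a recording session, another app can’t capture the mic at the same time.

    import AVFoundation

    // Hypothetical example: each app configures its own audio session, and
    // activating one for recording takes exclusive use of the microphone.
    func startRecording(to url: URL) throws -> AVAudioRecorder {
        let session = AVAudioSession.sharedInstance()

        // .playAndRecord requests microphone input; while this session is active
        // and recording, another app can't capture the mic at the same time.
        try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
        try session.setActive(true)

        // Illustrative AAC settings for a mono, podcast-style recording.
        let settings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1
        ]
        let recorder = try AVAudioRecorder(url: url, settings: settings)
        recorder.record()
        return recorder
    }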
The audio architecture for Apple’s mobile devices is certainly in need of some revamping, but what would be even better is if Apple designed a system that gave third-party developers deeper access to audio. On the Mac, apps such as Rogue Amoeba’s Audio Hijack let you easily route audio through complex workflows, mixing streams together, sending them to different outputs, and making recordings. iOS lags far, far behind by that measure, and it would be great to see apps like Audio Hijack and Loopback make their way to the platform for more robust audio management. But that’s going to need some love and attention from Apple.
More like stere-no pairing
Let’s stick with audio for a moment. I’m probably one of the relatively few to own not just one but two HomePods, which live on my desk as a stereo pair. They’re great for playing music from my iPhone or iPad, or from iTunes on my iMac. But they also sit nearly side-by-side with a pair of Altec Lansing speakers that are connected directly to my iMac, meaning that a significant portion of my desk is occupied by speakers.

iTunes is the only software on macOS that recognizes a HomePod stereo pair.
So why have both the HomePods and wired computer speakers? Simple: the only software on macOS that recognizes a HomePod stereo pair is iTunes. The system audio output treats the HomePods as separate devices, letting you output to only one at a time. (And even outputting to a single HomePod seems to introduce some sync problems if you’re trying to watch video.) This is a bit of a head-scratcher for a company that prides itself on how well its products work with one another.
I have no doubt this isn’t a particularly high priority for Apple; I’m sure not a lot of people are dying to spend an extra $600 on computer speakers. But if you’ve already shelled out that money for a pair of HomePods, it would be nice if you could at least get a little more bang for the bucks you’ve already spent.
Pulling focus
I hesitate to mention this last one, because it’s the kind of seemingly small change that could very well be fixed in a later beta, but it still boggles my mind that Apple hasn’t already addressed it: app focus when multitasking on the iPad with a keyboard.
Let’s say you’re running two apps side-by-side in Split View (something that will no doubt become even more commonplace with the new multitasking features in iPadOS). Take Safari and Mail, for example. You hit Command-N and…what happens? Does a new mail message start? Does an empty Safari tab open? Frankly, it’s a crapshoot.

iOS’s Split View could use some enhancements to make it more usable.
That’s because there’s no visual indicator telling you which of your two side-by-side apps currently has the focus. This has always been part of macOS: the active window looks different from the windows in the background. But iOS and iPadOS never really had a “window” concept until multitasking features arrived, and, for some reason, the idea of a visual cue never made its way over. (There are tricks, like holding down the Command key to bring up the list of keyboard shortcuts, or using Command-Tab to bring up the app switcher, but they’re both cumbersome and require user action.)
Moreover, deliberately changing which app has the focus isn’t as simple as tapping on that app, à la clicking on a window in macOS. Instead, you need to actually interact with the app in some way: tap a button, put the cursor in a text field, and so on. All of this feels a little clumsy, especially given that it’s been a solved problem on the desktop for so long.
Perhaps Apple will indeed address this over the next few months, as it iterates from beta to beta, but my gut says probably not. Ah well, there’s always next year.