A fascinating report at 9to5Mac offers a sneak peek at some of the announcements Apple has in store for developers at WWDC in June. Previous reports have focused on changes coming to iOS 13 and macOS 10.15, but this leak is all about the tools developers use to make apps and services on Apple’s platforms. If the leaks prove true, they could mean a big improvement in the way we use our iPhones, iPads, and Macs.
Siri stretches out
We’re certainly not the only ones lamenting the current state of Siri and the way Apple squandered an early lead in AI assistants. Apple has a lot of work to do to improve Siri, but the 9to5Mac report gives us a clue that at least some of Siri’s shortcomings may be addressed in the new versions of Apple’s operating systems.
The report claims that developers can make use of new Siri intents, including “media playback, search, voice calling, event ticketing, message attachment, train trip, flight, airport gate and seat information.”
Third-party developer support is divided into domains (broad categories like “Fitness” or “Messaging”) and intents (specific functions your app would like to enable with Siri, like “start a workout” or “send a message”).

Siri works great with Apple Music. Maybe with iOS 13, it will work great with other services, too?
This list of new intents is exciting in part because it seems to hint at new domains.
When it comes to supporting non-Apple apps, services, and devices, Siri has one big problem: it doesn’t work with enough stuff. Look at Apple’s developer guidance for domains and intents and you’ll see obvious missing pieces.
For example, there’s no media playback domain at all. Which is why you can use Siri to control your device with generic commands like “next track” and “volume up” but you can’t use Siri to control any media playback other than Apple Music with commands like “play my Discover Weekly playlist on Spotify” or “play Queer Eye on Netflix” (the latter command just opens the show’s page in the TV app, rather than jumping right into Netflix).
Apple needs to do more than just give developers more ways to hook their apps and services into Siri. Siri needs better voice recognition, and improved ability to answer general questions, and vastly expanded HomeKit integrations. Maybe those things are coming as well, but at the very least, it’s great to see Apple opening Siri up to more integrations with more kinds of apps.
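To make the domain/intent split concrete, here’s a minimal sketch of how a developer handles one of today’s existing SiriKit intents, the “send a message” intent in the Messaging domain. The new intents in the report would presumably be adopted through the same handler pattern; the class name here is our own.

```swift
import Intents

// A minimal handler for SiriKit's existing "send a message" intent.
// Siri resolves the user's request, then calls handle(intent:completion:).
class SendMessageHandler: NSObject, INSendMessageIntentHandling {
    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // In a real app, hand the message content off to your messaging
        // service here, then report success or failure back to Siri.
        let response = INSendMessageIntentResponse(code: .success,
                                                   userActivity: nil)
        completion(response)
    }
}
```

Each domain Apple opens up means a new family of intent classes like this one that third-party apps can respond to.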
Expect lots of iOS-born apps on your Mac
In macOS Mojave, we have only four apps, all made by Apple, that make use of its “Marzipan” framework for bringing iOS apps to the Mac: News, Voice Memos, Stocks, and Home. Apple calls this a “sneak peek” at the technology, a way for the company to test what it means for the Mac to run apps made with the UIKit tools made for iOS developers with a minimum of code changes.
The 9to5Mac story paints a picture sure to result in an absolute explosion of UIKit apps on your Mac.
Perhaps the most shocking statement is this: “Enabling Mac support for an existing iOS app is as easy as checking a checkbox in the target settings in Xcode, much as you would to add iPad support to an iPhone-only app.”

Last year, Apple told us it’s bringing Mac and iOS development closer together. This year, it may reveal that they will be closer than we imagined.
If it really is that easy to bring iOS-native apps to the Mac, we can expect a near-instant flood of thousands of such apps. I suspect that 9to5Mac’s source was being a little hyperbolic, and that there are certain requirements that must be met before a developer can port an iOS app to Mac with just a checkbox. Apps that use some third-party libraries probably won’t be so easy to bring over. And Apple will probably have design requirements for app approval, like making sure an app works properly with mouse and keyboard.
To help prevent UIKit apps from feeling like an iPhone emulator running on your Mac, Apple is adding some significant new APIs for developers. UIKit apps will reportedly be able to access the Touch Bar and menu bar, and open multiple windows on the Mac. iPad apps that support Split View will resize more like regular Mac apps do. These additions alleviate most of the biggest concerns from Mac fans who were disappointed by the four UIKit apps Apple released with macOS Mojave.
With these features, it’s not at all hard to imagine that, a year from now, we will all regularly use several Mac apps that are born from iOS apps, especially as developers come to grips with how to use cloud storage to keep your data in sync and learn to adjust their interfaces appropriately to each platform.
Hopefully, Apple is planning to better unify its iOS and Mac App Stores for this brave new world. We want to be able to buy an app once and get it on all supported platforms (if the developers want to sell it that way). We want to know if that app we’re looking at on our iPhones has a Mac app, and vice-versa.
This could have a huge impact on Mac sales. The iOS market is many times larger than the Mac market. When Apple can run ads showing how its laptops run all those apps you love on your iPhone, it may be the strongest selling point for Macs in a decade.
Augmented reality continues to improve
Apple is serious about augmented reality (AR), and continues to push hard on improving its ARKit tools for developers.
We saw a lot of improvements in ARKit 2 with iOS 12, and it looks like iOS 13 will continue to make AR apps more powerful and easier to build. Apple is adding a new Swift-only framework with a companion app that lets developers create AR apps in a more visual way.

ARKit, and its apps, continue to improve. But it won’t really take off until Apple integrates augmented reality into its core apps.
ARKit will be able to detect human poses, which will be critical for making virtual objects interact with real people. And game developers can make AR apps that use touchpads and deliver AR sound with stereo headsets.
ARKit has seen some mixed success. There are quality AR apps out there, and the technology works well, but it’s not yet the sort of thing the average person uses on a daily basis. Apple’s only real built-in AR app is Measure, which is definitely not the transformative experience AR needs to go mainstream.
Better tools for developers are key, but it’s also going to be important for Apple to integrate augmented reality into its own core iPhone apps, and make it truly useful.
Building a stronger foundation for NFC, haptics, and machine learning
Currently, the NFC tools for iOS only allow developers to read tags formatted with the NFC Data Exchange Format (NDEF). This will reportedly be expanded to allow developers to read ISO 7816, FeliCa, and MiFare tags as well.

Apple has been way behind Android in support for NFC standards, but that could change this fall.
That’s going to allow developers to make iPhone apps that work with a much larger variety of card-tap systems. From student IDs to vending machines to public transportation passes and even corporate lock keycards, the vast majority of NFC stuff in the world uses one of those four formats. With this expanded NFC support, the right apps could allow you to replace nearly every card in your life.
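For context, here’s a sketch of what today’s NDEF-only reading looks like with Core NFC. Reading the other formats mentioned in the report (ISO 7816, FeliCa, MiFare) would require new session or tag types beyond this one; the class name here is our own.

```swift
import CoreNFC

// Today's limited NFC support: an NDEF-only reader session.
class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    var session: NFCNDEFReaderSession?

    func beginScanning() {
        // Starts a session that ends after the first tag is read.
        session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Each NDEF message contains one or more payload records.
        for message in messages {
            for record in message.records {
                print("Payload: \(record.payload.count) bytes")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        print("Session ended: \(error.localizedDescription)")
    }
}
```

Student IDs, transit cards, and keycards mostly don’t speak NDEF, which is why this expansion matters so much.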
The iPhone’s Taptic Engine is a visceral feedback experience unmatched by any other smartphone. Until now, developers have had very limited ability to create their own haptic feedback. The 9to5Mac story claims that devs will be given a lot more control over the Taptic Engine, which will make our apps feel better. Literally.
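Today’s “very limited ability” amounts to a handful of canned feedback generators, like this one. Whatever finer-grained Taptic Engine control the report describes would presumably go well beyond these few preset styles.

```swift
import UIKit

// The current, limited haptics API: a few canned feedback styles.
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.prepare()        // spins up the Taptic Engine to reduce latency
generator.impactOccurred() // plays a single, fixed "impact" tap
```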
Apple’s machine-learning framework, Core ML, is getting an update, too. Currently, developers train a machine-learning model, and then deploy it, using Core ML to run that fixed model in their apps. The new update will allow developers to actually update the model on device. This will allow apps that use machine learning to get smarter or more accurate without requiring an app update.
Other parts of Core ML will be enhanced, too. The machine vision framework gets a built-in image classifier, and developers will be able to use Core ML to analyze audio.
These changes mean a lot more flexibility for developers, so it’s likely we’ll see more apps that use Core ML to leverage the powerful Neural Engine hardware in modern iPhones.
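As a point of reference, here’s roughly how a fixed Core ML model is used today through the Vision framework. The function is our own sketch; the rumored on-device update capability would let the `model` passed in here improve after the app ships, rather than staying frozen until the next App Store update.

```swift
import CoreML
import Vision

// Runs a fixed, pre-trained Core ML classifier on an image via Vision.
func classify(image: CGImage, with model: MLModel) throws {
    let vnModel = try VNCoreMLModel(for: model)
    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        // Report the most confident label the model produced.
        print("\(top.identifier): \(top.confidence)")
    }
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
}
```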
On the Mac side, there’s a new API for writing device drivers and new file provider extensions that should help with the way cloud services integrate with Finder.
Welcome into the walled garden
Taken individually, each of these new features is a nice update that could enhance certain kinds of apps. If you look at them as a whole, a pattern emerges.
Apple isn’t exactly tearing down its “walled garden” ecosystem, but it is making it easier for more apps and services to visit and mingle. Our iPhones and iPads and Macs will, simply put, “work with more stuff.” More services, more hardware. Development tools are always meant to allow app makers to make better apps, and these are no different. But there appears to be a concerted effort to help make sure those apps work better on your Apple device, to allow them to integrate more completely.
Combine these alleged developer tools updates with previous rumors about iOS 13 and macOS 10.15, and Apple’s fall operating system releases start to look like a turning point in the way we use Apple gear. A “better together” approach where Apple welcomes more services, standards, and apps, while simultaneously strengthening the bond between its devices.
This doesn’t mean it will be easier to leave the Apple ecosystem. Quite the contrary, really: You’ll have less incentive to leave when your favorite apps and services work better with your Apple device, and when they work together more seamlessly.