With 2017 pretty much in the books, it’s time to cast our eyes forward to the year ahead. Apple’s already laid a lot of groundwork for what we can expect to see next year, but there are a lot of blanks still to be filled in. As ever, some of it can be gleaned from the tea leaves of what Apple’s already talked about, though, admittedly, there’s always an element of wishful thinking that plays into it as well.
Based on that heady combination, here are three technologies that I’m looking forward to hearing more from Apple about in 2018.
Chips and dips
Apple’s gotten into custom chips in a big way, especially in its iOS devices. But with the MacBook Pro’s Touch Bar and, more recently, the iMac Pro, it’s also begun to bring its own silicon to the Mac side of the fence. While the T1 and T2 chips in those devices don’t serve as the central processors, they do handle a host of other important functions. That’s especially true of the iMac Pro’s T2, which, in addition to securing the boot process, also brings together the system management controller, SSD controller, audio controller, and image signal processor: four critical but disparate functions handled by one chip instead of several.
Not only does this allow more efficient usage of physical space by consolidating chips, but it also works towards Apple’s favorite pastime: closer integration. No need to design around third-party components when you can build them yourself. Seems safe to say this won’t be the last time we hear about Apple’s custom chips appearing in Macs.
Rumor has it that Apple could be working on everything from its own power-management chips to its own wireless controllers (the latter of which we’ve already seen Apple experiment with via the W1 chip in AirPods). I suspect 2018’s going to be a big year for custom Apple silicon across all of its platforms.
One app platform (not quite to rule them all)
Speaking of endeavors that span multiple Apple platforms, we’ve got that report of a unified application platform for Mac and iOS apps that could, in theory, arrive as early as next year. Some cold water, and some context, has since been thrown on the idea’s ultimate implications, but the fact remains that streamlining app development across its two platforms is a move that makes sense for both Apple and developers, assuming the company can pull off what others have failed at.
This isn’t as simple as write-once-run-everywhere, but it could give developers a way to avoid reinventing the wheel when porting their apps from one platform to the other. (Obviously there are still challenges in handling platform-specific UI and in dealing with capabilities that exist on one platform but not the other.) This kind of consolidation, though, strikes me as something we may eventually look back on as a step toward whatever comes after macOS and iOS, in the same way that Apple’s rollout of custom silicon could bring its different hardware lines into closer proximity.
So keep your eyes peeled at WWDC 2018, because I suspect we’ll see something announced that’s designed to ease life for developers who live on both the Mac and iOS, which, given that all iOS development still happens on the Mac, is pretty much all of them.
Augmented Reality check
Yes, AR made my list of technologies to watch last year, and yes, Apple did in fact deliver some impressive AR technology this year with the launch of its ARKit framework for app developers. But come on: it seems pretty clear there’s another shoe left to drop.
Even though the impressive horsepower of the latest iOS devices can show ARKit off to great effect—and though, as Tim Cook boldly put forth at its introduction, this immediately made Apple the largest AR platform in the world—there’s still something missing. The experience as it stands today is hampered by the simple fact that it’s constrained to your phone, which you in turn have to hold out in front of you. Rather than being immersed in this new enhanced reality, you’re peering into it through a peephole—staring at your phone rather than at the world around you.
So in 2018 I’m looking for Apple to tell us the other half of the story. Now that developers are out there creating applications and experiences to use with AR, how is it really going to impact our lives? What’s Apple’s narrative for it? Changing up the context of how we experience AR is going to markedly change how it’s perceived—as a gimmick or as the world-changing technology that Apple has promised us it will be.