The pros and cons of an Apple-Intel divorce
Rumors persist that Apple will switch the Macintosh from Intel’s x86 processors to ARM-compatible chips like those in the iPhone, iPad, and other iOS devices. Is that possible? Is it even a good idea?
The rumors didn’t come out of nowhere. In late 2010, then-CEO Steve Jobs suggested that iOS and OS X would eventually merge. In mid-2011, Apple read Intel the riot act about its power-hungry Core chips in the Mac line and hinted it might move the Macs to ARM-based CPUs.
Perhaps mindful of Apple’s previous success in changing chips—first from the Motorola 680x0 to the IBM/Motorola PowerPC in 1994, then from the PowerPC to the Intel x86 in 2005—Intel quickly reminded the industry it had power-efficient “Ivy Bridge” Core processors and “Medfield” Atom processors in the works. “Ivy Bridge” began shipping in mid-2012, powering both Macs and Windows Ultrabooks.
What’s behind the “abandon Intel” rumors
Despite Intel’s moves to make the x86 more power-efficient, the rumors that Apple will dump x86 processors continue, perhaps because of the slow adoption of Intel’s Atom processors in smartphones, or because Apple keeps making the A-series ARM-based CPUs in its iPads ever faster. But persistent rumors can be wrong. After all, we still hear that NASA faked the Apollo moon landings and that Elvis Presley lives.
Although the Apple rumor is more plausible, that doesn’t make it true. And even if reports are accurate that Apple already has iOS up and running on ARM-based Macs in the lab, the switcheroo won’t necessarily happen. Lab geeks are always up to something. Besides, the rumor has various incarnations. Some say Apple will abandon OS X and switch the Mac to iOS without changing the processors. Others invert the rumor, predicting that Apple will switch iOS devices to the x86 when Intel’s Atom processors are more competitive with ARM on power efficiency.
The common ground beneath all this speculation is that Apple will eventually merge its two major product lines—Macintoshes and mobile devices—with a single microprocessor architecture, operating system, software-development environment, and app store. For both Apple and users, life will become simpler. In some ways, this grand unification theory makes sense, which lends it credibility. Even if it doesn’t make sense, it could still happen—if it were a last wish of Steve Jobs.
The need to merge iOS and OS X is questionable
First, supporting two product lines with different microprocessor architectures and operating systems is not a great burden for a large, rich company like Apple. Each product line generates enough revenue to justify some duplication of resources. In fact, such redundancy is better than compromising performance or design flexibility by forcing the products into a one-size-fits-all platform. Apple knows this—if the bean counters ruled Apple, its trend-setting products would be as boring as those from many of its clueless competitors.
Another point to consider is that Apple CEO Tim Cook has already tried to quash at least one version of the “merged iOS and OS X” rumor. In December, he told Bloomberg Businessweek, “We don’t subscribe to the vision that the OS for iPhones and iPads should be the same as Mac. Customers want iOS and Mac OS X to work together seamlessly—not to be the same, but to work together seamlessly.”
That statement appears to rule out an OS merger, which would be the easiest path to unification. Essentially, Apple created iOS by forking OS X, so the two operating systems already have much in common. Microsoft is moving in the opposite direction, converging Windows with the Windows Phone OS—as people confounded by the new Windows 8 user interface will attest.
If Cook’s word is good, OS X and iOS will continue to exist as parallel universes. Unification, if it happens, will harmonize the low-level hardware, not the operating system: Either Macs will get ARM processors or iOS devices will get x86 processors. Good arguments abound for both scenarios.
Why Apple is unlikely to switch the Mac to ARM processors
For about two years now, the tech industry has thrilled to a rematch of the RISC-versus-CISC wars of the 1990s (embodied by RISC processors like the PowerPC and CISC processors like Intel’s Pentium). The war fizzled on the desktop, and by the mid-2000s, ARM’s RISC architecture dominated 32-bit embedded processing (especially in mobile devices) while Intel’s x86 architecture dominated PCs and servers. Frankly, much of the current buzz is fueled by the industry’s hope that someone, anyone, will mount a serious challenge to Intel, especially now that AMD is mired in another funk.
Compared with Intel, ARM is a puny company, but its licensing model is a force multiplier that allows ARM-based processors to outsell x86 processors by about 20 to 1. Anybody with money can license off-the-shelf CPU cores from ARM to design a chip or license the architecture to design an ARM-compatible CPU core. Several companies will manufacture those designs for you.
One of those architecture licensees is Apple, which is now designing its own A series of CPUs for iOS devices instead of buying chips designed by outside suppliers like Nvidia, Qualcomm, or Samsung. Intel will not license the x86 architecture, so if Apple wants to design its own Mac processors, ARM is the logical alternative.
One obstacle, however, is that no one has ever created an ARM processor as powerful as Intel’s best PC processors. ARM fanboys tend to overlook this inconvenient fact. In theory, it’s possible, of course. Mainly, no one has tried.
In the past, ARM and its licensees focused on minimizing power consumption, not maximizing performance. Low power usage is much more important for mobile devices than it is for desktop PCs, which draw their power from AC sockets and dissipate the heat using multiple fans or, in Apple’s case, aluminum shells. Even lightweight notebook PCs can tolerate hotter chips than smartphones and tablets.
Until 2011, ARM didn’t even have a 64-bit architecture. ARM’s first 64-bit chips are still under development, expected to reach the market this year. They are intended for lower-power servers, which will allow comparisons with competing x86 processors. No company can defy the laws of physics; as ARM processors grow more powerful, they will inevitably need more juice. The critical factor is the power/performance ratio—the amount of processing performed per watt. If there’s hope for ARM to crack the desktop, it will come from delivering more performance per watt than Intel.
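The power/performance ratio described above is simple enough to compute. The sketch below makes the idea concrete with deliberately hypothetical numbers—the scores and wattages are invented for illustration, not measured specs of any real chip:

```python
# Hypothetical chips: the figures below are illustrative only, not real benchmarks.
chips = {
    "mobile_arm":  {"perf_score": 4000,  "watts": 2.0},   # modest speed, very low power
    "desktop_x86": {"perf_score": 30000, "watts": 77.0},  # much faster, much hotter
}

def perf_per_watt(chip):
    """Processing performed per watt -- the critical ratio in the argument above."""
    return chip["perf_score"] / chip["watts"]

for name, chip in chips.items():
    print(f"{name}: {perf_per_watt(chip):.0f} points per watt")
```

With these made-up numbers, the mobile chip delivers far more work per watt even though the desktop chip is faster in absolute terms—which is exactly the trade-off that would have to hold at desktop performance levels for ARM to crack the PC market.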
Intel’s ace is superior manufacturing technology, which is about four years ahead of everyone else’s. In the semiconductor industry, that’s huge. Besides its lead in process geometry—Intel is comfortably mass-producing chips in a 22-nanometer process while the rest of the industry is adopting 28nm—Intel is using trigate (3D) transistors called FinFETs, whereas everyone else is still using planar (2D) transistors.
This manufacturing lead is an enormous advantage that ARM-based processors must try to overcome with superior design and efficiency. Yes, the ARM architecture is more efficient than the x86 in some ways, but that’s a slimmer advantage than Intel’s manufacturing prowess.
Another factor is Intel’s relentless design pipeline. The company introduces new or improved processors every year and rarely misses an announced production date. Any company hoping to compete with Intel must not only create a superior design, but also follow that chip with even better designs on a similar schedule. As AMD can tell you, it ain’t easy. Although Apple has much more cash to spend than AMD, it’s doubtful that Apple employs enough chip-design expertise to match Intel’s aggressive pace.
Over time, switching the Macintosh from x86 to ARM could doom the computers to inferior performance. Apple remembers full well how the vaunted PowerPC chips it adopted in 1994 lost their early lead over Intel’s chips within a few years, as IBM and Motorola fell further and further behind Intel’s aggressive x86 improvements. As a result, Apple dumped the PowerPC for x86 in 2005. (That’s about when Apple stopped using “Power” in most of its product names and switched back to just “Mac.”)
The fact is that Apple moved from Motorola 680x0 chips to the PowerPC in 1994 and from the PowerPC to the x86 in 2005, proving twice that it could make a drastic forklift upgrade to its Mac hardware without wrecking its software ecosystem. So, today’s speculators reason, why not again?
In those two big chip transitions, Apple prevailed by using nearly transparent emulation and other clever tricks, like “fat binaries” that bundled two versions of the same program in a single executable package, one for each processor. True, each transition took software developers on a wild ride, but most users found the switch tolerable and often seamless.
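The “fat binary” trick mentioned above boils down to a container format: one payload per processor architecture, plus a small table of contents, with the loader picking the slice that matches the host CPU. The toy sketch below illustrates the idea; the header layout and arch tags are invented for this example and are not the real Mach-O fat-binary format:

```python
import struct

# Toy fat-binary container (an invented layout, NOT real Mach-O):
#   4-byte slice count, then one 16-byte entry (arch tag, offset, size)
#   per slice, followed by the concatenated payloads.

def make_fat(slices):
    """slices: dict mapping an arch tag (bytes) to that arch's code bytes."""
    entries, payloads = [], b""
    offset = 4 + 16 * len(slices)            # payloads start after the header
    for arch, code in slices.items():
        entries.append(struct.pack(">8sII", arch.ljust(8, b"\0"), offset, len(code)))
        payloads += code
        offset += len(code)
    return struct.pack(">I", len(slices)) + b"".join(entries) + payloads

def load_slice(fat, host_arch):
    """Pick the payload matching the host CPU, as a loader would."""
    (count,) = struct.unpack_from(">I", fat, 0)
    for i in range(count):
        arch, off, size = struct.unpack_from(">8sII", fat, 4 + 16 * i)
        if arch.rstrip(b"\0") == host_arch:
            return fat[off:off + size]
    raise ValueError("no slice for this architecture")

# Bundle two stand-in "executables" and let an "x86 host" pick its slice.
fat = make_fat({b"x86": b"\x90\xc3", b"arm": b"\x1e\xff\x2f\xe1"})
print(load_slice(fat, b"x86"))
```

The payload bytes here are placeholders; the point is only that one file carries both versions and the right one is selected at load time, invisibly to the user.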
Still, the difficulty of swapping CPU architectures should not be underestimated. Although Apple can undoubtedly do it again, switching the Mac to ARM may not gain it the same advantages as previous switches. Remember that in the two previous switches, the old architecture was falling way behind the performance curve, but that’s not so with the x86 today. Indeed, Intel is still leading the curve.
Why Apple is unlikely to switch iOS to Intel processors
Rumors that Apple will port iOS to the x86 are abetted by the steady improvement of Intel’s Atom-based chips for smartphones and tablets. Keep in mind that Intel and ARM started from opposite ends of the spectrum: Intel from high performance, ARM from low power. Now each company is moving toward the other’s position, and the two have nearly met in the middle.
Intel’s Atom processor, introduced in 2008, was the company’s first attempt to drastically reduce x86 power consumption. Initial Atom chips were great for netbooks but still way too hot for smaller devices like smartphones. In 2010, Intel’s “Lincroft” processor (officially, the Atom Z600 series) moved graphics, video coding, DRAM control, display control, and I/O interfaces onto the same chip as the Atom CPU core. For Intel, this was a big step in chip integration and power efficiency, but “Lincroft” still ran relatively hot and required a companion chip to match the functions of competing single-chip processors.
In 2012, Intel introduced “Medfield” (officially, the Atom Z2460). This chip finally integrated the critical smartphone application functions on a single chip and cut power consumption to competitive levels. Even so, it’s not quite good enough to convince most smartphone vendors to rewrite their ARM software for the x86. In fact, Intel has had to take on the work of porting Android to Atom; despite that effort, very few smartphones or tablets use Atom chips—even though most Android apps are written in Java, which bridges chip platforms more easily than natively compiled code.
The next generation could tilt the balance in Atom’s favor by fully exploiting Intel’s manufacturing advantage. Up to now, Atom chips have lagged behind the company’s PC and server processors. Whereas Intel manufactures its leading PC and server chips in the latest 22nm FinFET process, “Medfield” gets by with the previous-generation 32nm planar process.
From a business standpoint, this strategy makes sense, because PC and server processors sell at higher prices and in higher volumes. Future Atom chips, however, will no longer be hand-me-downs. Atom chips are moving to 22nm technology this year and to next-generation 14nm technology in 2014. ARM manufacturers simply can’t match that pace.
But even if Atom can beat ARM’s power/performance efficiency, Apple may not switch iOS to the x86. All those zillions of apps for iPhones and iPads are natively compiled for ARM. Either software developers would have to recompile them for the x86 or Apple would have to provide an ARM-on-x86 emulator. Although Apple has successfully used emulation to smooth the Mac’s platform transitions, an emulator’s overhead in CPU clock cycles, memory, and power would be a greater burden for a mobile device. Unless an Atom processor can emulate ARM faster than an ARM processor can run its own native code, Apple would have little or no technical reason to switch CPU architectures.
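The overhead argument above can be seen in miniature. An emulator must fetch, decode, and dispatch every guest instruction in software, so each guest operation costs many host operations—work that native code never pays for. The toy interpreter below runs an invented three-instruction machine (not real ARM) and counts the dispatch-loop iterations to make that per-instruction overhead visible:

```python
# Toy emulator for an invented 3-instruction guest machine -- not real ARM.
# Each guest instruction is an (opcode, operand) pair; the fetch/decode/
# dispatch loop is pure software overhead on top of the actual arithmetic.

def emulate(program):
    acc, pc, dispatches = 0, 0, 0
    while pc < len(program):
        op, arg = program[pc]
        dispatches += 1                  # one trip through the dispatch loop
        if op == "ADD":
            acc += arg
        elif op == "MUL":
            acc *= arg
        elif op == "JNZ":                # jump to arg if accumulator is nonzero
            if acc != 0:
                pc = arg
                continue
        else:
            raise ValueError(f"unknown opcode: {op}")
        pc += 1
    return acc, dispatches

# Compute (3 + 4) * 5 on the guest machine.
result, dispatches = emulate([("ADD", 3), ("ADD", 4), ("MUL", 5)])
print(result, dispatches)   # 35 3
```

Every guest instruction here costs a tuple unpack, a counter update, and a chain of comparisons before any real work happens—trivial on a desktop, but exactly the kind of CPU-cycle, memory, and power tax that matters on a battery-powered device.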
Don’t rule out the Steve factor
So far, all this analysis overlooks one vital factor: Steve Jobs. Although Apple’s co-founder no longer dwells among the living, his famous “reality-distortion field” radiates from the grave.
Jobs never liked sharing profits with anyone, which is why Apple’s platforms are walled gardens. Apple’s “curated” model of app-store merchandising exerts almost total control over software distribution while keeping a large slice of the revenue. That model started with iTunes, surged with the iPhone and iPad, and is gradually encompassing the Mac. Now, instead of buying readily available off-the-shelf chips, Apple is designing its own ARM CPU cores and iOS application processors—difficult projects that cost the company about $500 million for acquisitions, licenses, and engineering just to get the first chip out the door.
In other words, sometimes Apple goes to great lengths to assert control over its platforms and customize its products, never mind the expense. Thus far, the strategy has paid off, making Apple one of the world’s largest companies by market capitalization and helping it amass $121 billion in surplus cash.
Today, Apple can afford to take risks and go its own way. It would have been characteristic for Jobs to declare his independence from Intel by decreeing that all future Apple products must someday use Apple-designed processors based on Apple’s own ARM CPU cores. Even if it wasn’t an explicit command, his inheritors may be thinking along the same lines, due to their longtime exposure to his reality-distortion field. Having fathered a few successful application processors, they may now believe they can beat Intel at its own game.
Therefore, you can’t rule out that Apple will switch the Mac to ARM or move iOS to the x86. This is not a company that follows the beaten track.
Technically, however, the best bets are that x86 processors will remain the high-performance leaders for desktops, laptops, and servers, while ARM processors will not lose their low-power advantages for mobile devices. Expect the rest of the industry to follow those assumptions until something radically changes. Apple, as it has in the past, could be that radical catalyst.