The Mac at 25: Successes, regrets, Apple’s had a few
By Dan Turner
Remember the adage, “Don’t trust anyone over 30”? Putting aside the fact that you’re likely well past that age if you recognize the line, if you live by it, you may view 32-year-old Apple with a gimlet eye. But the Macintosh itself—which will hit the 25-year mark on January 24—is still something we can trust.
We journalists live for significant anniversaries, which allow us to take a retrospective view of something in the news and look back on events that, in the moment, might not have seemed so momentous. Twenty-five years ago, Apple took the wraps off the first Macintosh, announcing it in a Ridley Scott-directed commercial—a riff on George Orwell’s 1984—that aired nationwide just once, during the 1984 Super Bowl, and went on to become iconic.
In these heady days when Apple seems to be gaining ground in a number of places and ways, it’s important to remember that everything that followed from the first Mac was not a given. If things had gone differently, maybe Microsoft would be the cool, hip upstart now. With that kind of alternate reality in mind, here’s a brief and far-from-complete collection of five successes and five mistakes Apple has made in the last quarter-century.
Let’s look at Apple’s smooth moves:
The Human Interface Guidelines
What did computers look like in 1983? When you turned them on, what did you see?
Odds are, it was a green cursor on a black screen. You had to know how to do what you wanted to do, and then were limited to what you knew how to do—a vicious circle of limitation. Sounds like something Joseph Heller might have come up with, no?
The first Mac, in 1984, was something totally new and different to almost everyone in the computer and noncomputer worlds alike. The windows/icons/mouse/pointer (WIMP) interface, first pioneered at Xerox PARC, was intelligible at a glance and set the paradigm for almost every personal computing interface to follow.
Still, it all could have gone bad (imagine if Windows 3 had been the first UI offered) if not for the coherence and progressive discovery offered by the carefully designed Mac user interface.
That was the result of a lot of work, both theoretical and practical, by Apple’s Human Interface Group on how people looked at and reacted to the various parts of an interface. They codified and published the principles and applications of the Mac interface as the Human Interface Guidelines (HIG), covering everything from how to make a button to where drop shadows should fall on screen to how quickly a visual cue should appear after a user clicks.

This wasn’t just a good idea in the abstract—though it was indeed a very good idea. The public HIG encouraged developers to produce applications that looked and acted like the familiar Mac interface. Users weren’t confused by a whole new way to save, or move, or do anything else each time they loaded a new program. Think that’s a trivial achievement? Take a look at the Interface Hall of Shame. You could be stuck with that.
MacPaint and MacWrite
Bundled with the original Mac were two breakthrough applications: MacPaint, which let users “paint” by clicking and dragging the mouse, and MacWrite, which let them create and edit text files in a then-new WYSIWYG way. These programs turned every Mac in a store display into an interactive advertisement—Hey, Ma, look what I can do!—something even an experienced programmer couldn’t easily have pulled off on earlier computers.
These two apps, with toolbars and drop-down menus, set the stage for every application that came after—including ones like Word, which tossed MacWrite into the dustbin of history.
The all-in-one design
The first Mac came in a new shape: a user-friendly, all-contained, all-in-one design (except for keyboard and mouse). It even had a built-in handle on the top for moving it around, and there were carrying bags available for maximum portability, though it was a heavy package to lug around. But heavy or not, it was a lot easier to move around and set up than the cable-fests offered by competitors.
By necessity, Apple moved away from the combined Mac/monitor design, allowing users to pick and replace monitors and easily access expansion slots and the like. Then in 1998, Apple CEO Steve Jobs debuted the iMac. (Personal note: I worked at MacWeek then, and we broke the story the night before.) The “i” was for Internet, remember—and the iMac, with its “there is no Step 3” setup, brought back the “computer for the rest of us” trope for the connected world.
The iMac’s descendants, including the eMac, have been iconic and among Apple’s best sellers. In fact, the current iMac line seems to have drawn Apple’s focus away from its pro desktops, which haven’t been refreshed in a good, long while.
Nailing the hardware and software transitions
Linux and Unix users like to show off how they can select just the right distro, recompile, read manuals, scour online forums for new hardware configurations and eventually wind up with their operating system of choice running on their hardware of choice. Fun city? Corporations with a large and not-as-technical user base have to provide a smoother path when making hardware or software changes. Microsoft has often shown how hard it can be. Look at broken drivers when moving from Windows XP to Vista, problems with 64-bit software, and the ongoing nightmare of backward compatibility.
And yet Apple has managed to pull it off again and again. It migrated users from 68k to PowerPC, and later from PowerPC to Intel—no easy process either time. Apple managed to make each change seamless in software for users: with each move, a transparent emulation technology was in place from Day One.
And the change from Mac OS 9 to Mac OS X, while bumpy for a few years, was smoothed by the Classic environment that allowed the reluctant (like me) to live part-time in the new operating system. We still had the OS 9 security blanket, and could work with mission-critical apps that hadn’t been ported to OS X yet.
The iPod, the iPhone and the iTunes Store
The iPod wasn’t the first digital music player, not by a long shot, and when the original model was introduced, some people were skeptical to the point of calling it “lame.” But it grew into an insanely great phenomenon, and may have changed the world. And the iPhone is shaking up the mobile phone market (no matter what John Dvorak and Steve Ballmer say).
More than just moneymakers, the two have extended Apple’s brand exponentially. Wisely, iTunes was made cross-platform (how well it was ported to Windows is another issue), enabling the iPod to get a hook into more than just Mac users. And both gadgets have a “halo” effect: love your iPod? Check out Apple’s other fine products!
Even in a world of netbooks, the iPhone could be a first step toward truly mobile computing. It’s not a shrunk-down PC—why would you want a Start menu on a tiny cell phone screen?—but something new that could work its way up toward a new and useful paradigm. Already, you can use your iPhone to SSH into servers, exchange (but not edit or save) files, join WebEx conferences, and do all sorts of things hardworking people usually need a computer for. I’ve longed for a Google implant; this could be the closest thing for the foreseeable future.
And then there were the stumbles
The Apple III
Introduced less than four years before the arrival of the Macintosh, the Apple III was supposed to be the “business” computer to succeed the Apple II. It made sense to offer a more powerful, more “serious” computer for the more power-hungry, more serious crowd. And Steve Wozniak, a.k.a. the Woz, a.k.a. the other founder of Apple, was in on the design. However, it didn’t come together—literally, in a lot of cases. The circuit board was tightly packed, causing short circuits. One technical bulletin told users to pick up their Apple III and drop it a few inches to reseat chips. And Jobs demanded there be no fan, which caused heat-related problems in the hardware. (Jobs continues to push that anti-fan agenda to this day; maybe he hates the sound.) Other software problems, a high price and problematic backward compatibility with Apple II software all made this a big failure, and Apple’s rep in the business world was pretty well damaged.
The Performas—oh God, the Performas
After Apple disgorged Steve Jobs and brought in ex-Pepsi executive John Sculley, the latter got the idea to spew out many SKUs of Macs. This was the Performa line, designed to be less intimidating than the Mac itself (intimidating?), but the sheer landslide of barely distinguishable models was intimidating enough. With the same basic hardware, there were educational models, direct sales models, models for sale at a mass-market retailer … each software bundle might be a little different from the others, but who could keep track?
Also, it didn’t help that most Performas were, well, crap. The quality ranged from not so great to awful. For example, the 4400—a “fat pizza box” desktop—was supposed to be targeted at casual business users, but it was so poorly built that peripherals would suddenly stop working, hardware glitches would cause hard crashes, and so on.
Needless to say, this adversely affected Apple’s image of providing high-quality products. Soon after his second coming, Jobs made quality Job 1, or something like that. He also quickly stripped down the product matrix: one consumer laptop, one pro laptop, one consumer desktop, one pro desktop. There is no Step 5.
The cloning vats
From 1995 to 1998, Apple tried something new: licensing. It’s been an article of faith among the “Apple will die … any day now” crowd that the company made a fatal mistake in restricting Mac operating system use to actual Macs. The idea was that if Apple became just an operating system vendor, like Microsoft, it would grow like Topsy—like Microsoft.
Shaky logic there, but Apple tried it. Starting in the non-Jobs era, I should add.
The strategy did have some salutary effects, not the least of which was enabling Power Computing’s awesome ad campaigns. DayStar Digital experimented with then-unusual multiprocessor configurations; some lower-priced Mac clones hit the market; some companies pushed the build-to-order and direct sales models; and Power Computing armed itself with ex-Apple engineers to push a few technical boundaries.
But the hardware licensing agreements were awkward, shortsighted and restrictive — Apple never quite seemed to commit. And the third-party developers, without strong Apple support and with high licensing costs, couldn’t find a way to offer products significantly different from Apple’s own. After Jobs’ return, he decided to end the licensing experiment. His rationale: It had begun too late to really make a difference, and the clones were cutting into Apple’s own sales instead of expanding the market. In late 1997, the whole shebang wound down, with Apple buying some of Power Computing’s assets for $100 million.
Some of us still have a few posters, though.
Not the best .edu sales structures
In its day, the Apple II had the closest thing to a lock on the educational market: expensive, but solid and easy to manage, with a great library of apps kids could use. Apple has remained strong in education since then, but not as strong as it could be.
Certainly, some of the reasons for this were out of Apple’s control. After Windows 95 was introduced—“Start me up … you make a grown man cry”—Microsoft used its hefty connections and cash to donate tons of Windows PCs to secondary schools and higher-education institutions. This not only trained future Windows users, but it allowed Microsoft to write off the retail costs.
Apple tended to coast in terms of pushing education sales. The company has reorganized its educational sales force many times, but it has never made the kind of push vendors like Dell or IBM did. For example, when a friend of mine, who ran computer systems at a major university, tried to price out a new computer lab, Dell, HP and others not only gave him bulk pricing, they also threw in same-day support, as well as teams of techs to come and install and configure the whole lab. Apple’s response? “The closest Apple store to you is … you can buy what you need from there.” At retail.
Clunky and weird online strategies
In the days of online walled gardens such as AOL, CompuServe and the like, Apple tried its own, calling it eWorld. Remember that? Nah, not many people do.
Since then, Apple has thrown up iTools, only to fold it. There was .Mac, which promised sort of a “cloud” experience but cost $99 a year and never seemed to go very far. That was recently rebranded MobileMe, with more of a Web 2.0 feel, but it instantly suffered from outages. And it still cost money, and still didn’t work quite right.
Granted, Microsoft’s Live initiative hasn’t rocked the world either, but it’s clear that online efforts aren’t Apple’s core strength, and poor services are more damaging to the brand than no services.
So there you have it. Highlights and low points from the last 25 years. Of course, there are lots more in each category. You don’t live past 30 without acquiring piles of little victories and regrets. Have your own top picks? Weigh in on the comments section.
[Dan Turner has been writing about science and technology for over a decade at publications including Salon, eWeek, MacWeek and The New York Times.]