The Mac at 25: Successes, regrets, Apple’s had a few
Remember the adage, “Don’t trust anyone over 30”? Putting aside the fact that you’re likely well past that age if you recognize the line, anyone who lives by it may view 32-year-old Apple with a gimlet eye. But the Macintosh itself—which will hit the 25-year mark on January 24—is still something we can trust.
We journalists live for significant anniversaries, which allow us to take a retrospective view of something in the news and look back on events that, in the moment, might not have seemed so momentous. Twenty-five years ago, Apple took the wraps off the first Macintosh, announcing it in a Ridley Scott-directed commercial—a riff on George Orwell’s 1984—that aired nationwide just once, during the 1984 Super Bowl, and went on to become iconic.
In these heady days when Apple seems to be gaining ground in a number of places and ways, it’s important to remember that everything that followed from the first Mac was not a given. If things had gone differently, maybe Microsoft would be the cool, hip upstart now. With that kind of alternate reality in mind, here’s a brief and far-from-complete collection of five successes and five mistakes Apple has made in the last quarter-century.
Let’s look at Apple’s smooth moves:
The Human Interface Guidelines
What did computers look like in 1983? When you turned them on, what did you see?
Odds are, it was a green cursor on a black screen. You had to know how to do what you wanted to do, and then were limited to what you knew how to do—a vicious circle of limitation. Sounds like something Joseph Heller might have come up with, no?
The first Mac, in 1984, was something totally new and different to almost everyone in the computer and noncomputer worlds alike. The windows/icons/menus/pointer (WIMP) interface, pioneered at Xerox PARC, was intelligible at a glance and set the paradigm for almost every personal computing interface to follow.
Still, it all could have gone bad (imagine if Windows 3 had been the first UI offered) if not for the coherence and progressive discovery offered by the carefully designed Mac user interface.
That was the result of a lot of work, both theoretical and practical, by Apple’s Human Interface Group on how people looked at and reacted to various parts of an interface. They codified and published the principles and applications of the Mac interface as the Human Interface Guidelines (HIG), covering everything from how to make a button to where the drop shadows should go on screen to how quickly a visual cue should appear after a user click.

This wasn’t just a good idea—though it was indeed a very good idea. The public HIG encouraged developers to produce applications that looked and acted like the familiar Mac interface. Users weren’t confronted with a whole new way to save, or move, or do anything each time they loaded a new program. Think that’s a trivial achievement? Take a look at the Interface Hall of Shame. You could be stuck with that.
Of course, things change, and there have been blips along the way—especially when Apple moved to Mac OS X.
MacPaint and MacWrite
Bundled with the original Mac were two breakthrough applications, MacPaint and MacWrite, which allowed users to “paint” by clicking and dragging the mouse, and to create and edit text files in a then-new WYSIWYG way. These programs turned every Mac in a store display into an interactive advertisement—Hey, Ma, look what I can do!—something that even an experienced programmer couldn’t have easily done on previous computers.
These two apps, with toolbars and drop-down menus, set the stage for every application that came after—including ones like Word, which tossed MacWrite into the dustbin of history.
The all-in-one design
The first Mac came in a new shape: a user-friendly, self-contained, all-in-one design (except for keyboard and mouse). It even had a built-in handle on the top for moving it around, and carrying bags were available for maximum portability, though it was a heavy package to lug around. But heavy or not, it was a lot easier to move and set up than the cable-fests offered by competitors.
By necessity, Apple moved away from the combined Mac/monitor design, allowing users to pick and replace monitors and easily access expansion slots and the like. Then in 1998, Steve Jobs, at that point Apple’s interim CEO, debuted the iMac. (Personal note: I worked at MacWeek then, and we broke the story the night before.) The “i” was for Internet, remember—and the iMac, with its “there is no Step 3” setup, brought back the “computer for the rest of us” trope for the connected world.
The iMac’s descendants, including the eMac, have been iconic and among Apple’s best sellers. In fact, the current iMac line seems to have drawn Apple’s focus away from its pro desktops, which haven’t been refreshed in a good, long while.
Nailing the hardware and software transitions
Linux and Unix users like to show off how they can select just the right distro, recompile, read manuals, scour online forums for new hardware configurations and eventually wind up with their operating system of choice running on their hardware of choice. Fun city? Corporations with large, less technical user bases have to provide a smoother path when making hardware or software changes. Microsoft has often shown how hard that can be: look at the broken drivers in the move from Windows XP to Vista, the problems with 64-bit software, and the ongoing nightmare of backward compatibility.
And yet Apple has managed to pull it off not once, not twice, but three times. It migrated users from 68k to PowerPC and then to Intel processor architectures, no easy process either time, and made each change all but invisible to users: with each move, transparent emulation technology (first a 68k emulator on PowerPC, later Rosetta on Intel) was in place from Day One.
And the change from Mac OS 9 to Mac OS X, while bumpy for a few years, was smoothed by the Classic environment that allowed the reluctant (like me) to live part-time in the new operating system. We still had the OS 9 security blanket, and could work with mission-critical apps that hadn’t been ported to OS X yet.
The iPod, the iPhone and the iTunes Store
The iPod wasn’t the first digital music player, not by a long shot, and when the original model was introduced, some people were skeptical to the point of calling it “lame.” But it grew into an insanely great phenomenon, and may have changed the world. And the iPhone is shaking up the mobile phone market (no matter what John Dvorak and Steve Ballmer say).
More than just moneymakers, the two have extended Apple’s brand exponentially. Wisely, iTunes was made cross-platform (how well it was ported to Windows is another issue), enabling the iPod to get a hook into more than just Mac users. And both gadgets have a “halo” effect: love your iPod? Check out Apple’s other fine products!
And even in a world of netbooks, the iPhone could be a first step toward truly mobile computing. It’s not a shrunk-down PC—why would you want a Start menu on a tiny cell phone screen?—but something new that could work its way up toward a useful new paradigm. Already, you can use your iPhone to SSH into servers, exchange (but not edit or save) files, join WebEx conferences, and do all sorts of things hardworking people usually need a computer for. I mean, I’ve longed for a Google implant—this could be the closest thing for the foreseeable future.