Steve Jobs died Wednesday after battling cancer and related conditions for seven years. He was 56. Jobs, who reigned as Apple CEO for 14 years,
resigned his post in August 2011 and was replaced by Tim Cook, who previously was the company’s Chief Operating Officer. Jobs, in turn, was elected as chairman of Apple’s board of directors.
Both as the founder of the first successful personal-computer company and as the man who transformed a nearly bankrupt Apple into one of the most successful companies on the planet, Jobs established himself as an American icon of business and technology.
Apple: The early years
If Steve Jobs had never returned to Apple after 1985, he’d still be remembered for the Macintosh.
Jobs didn’t create the Mac project—it was started by Jef Raskin in 1979—but he took it over in 1981 and brought it to fruition. Jobs didn’t write the code or design the circuit boards, but he was the one who provided the vision that made it all happen. As original Mac team member Andy Hertzfeld wrote, “Steve already gets a lot of credit for being the driving force behind the Macintosh, but in my opinion, it’s very well deserved … the Macintosh never would have happened without him.”
Apple’s introduction of the Macintosh in 1984 introduced the graphical user interface to mainstream desktop computing. The Mac ran on a 32-bit processor (compared to 16-bit processors for other PCs at the time) and had 128K of memory. It was an immediate success: more than 400,000 Macintosh computers were sold in the first year.
The Mac’s impact wasn’t just felt by the people who bought it in the ’80s, though: in hindsight, it redefined what a computer was. Microsoft introduced its Windows program as a reaction to it; by 1995 Windows had duplicated Apple’s graphical interface. Essentially every personal computer in existence now follows most of the paradigms introduced by the original Mac more than a quarter-century ago.
The Mac capped off a series of accomplishments for Jobs in the early days of Apple, which he co-founded in 1976 with Steve Wozniak and Ronald Wayne. The company famously started in Jobs’s garage, where it assembled its first computer, the Apple I. Its first mass-produced product was the Apple II, which was released in 1977. Designed by Wozniak, the Apple II featured a rugged plastic case, an integrated keyboard and power supply, support for color displays, and a 5.25-inch floppy drive. The Apple II was a wild success, ushering in the personal computer era, and carried Apple through the mid-1980s.
In the early ’80s Apple tried to build on its success with an Apple III targeted at business users, but it was a resounding failure. The story goes that Steve Jobs wanted the computer to run silently—a good example of Jobs’s attention to product detail—so he ordered that it be built without an internal fan. Unfortunately, customers found that the Apple III overheated frequently.
At the end of 1980, Apple went public; its IPO created hundreds of millionaires at the company. In exchange for $1 million of pre-IPO stock, Xerox gave Apple access to its PARC facilities, where Jobs and others saw the progress Xerox was making with the graphical user interface (GUI). That visit led to the Apple Lisa—a Mac-like computer that sold for nearly $10,000, and was never a success—and then the Mac.
Jobs was also a driving force behind the famous “1984” television commercial, directed by Ridley Scott, that debuted during the Super Bowl in January 1984. Jobs and his personally recruited CEO John Sculley thought the iconic ad was excellent, and purchased 90 seconds of Super Bowl commercial time for the spot. Apple’s board of directors was less convinced of the advertisement’s greatness, and Apple’s advertising agency Chiat/Day resold 30 of those seconds to another advertiser. The ad ran, and the Macintosh went on sale two days later.
Eventually, the Macintosh’s increasingly sluggish sales performance strained the relationship between Jobs and Sculley. Sculley favored introducing more IBM compatibility; Jobs was opposed. Jobs and Sculley each went before Apple’s board and lobbied for the other’s removal. Eventually, on May 31, 1985, Apple announced that—following its first-ever quarterly loss and a round of layoffs—Steve Jobs was leaving the company he’d co-founded. He left with a net worth of $150 million and started his next venture, Next.
In a commencement speech at Stanford University in 2005, Jobs said that his firing from Apple in the mid-1980s “was the best thing that could have ever happened to me.” That may have been true for Jobs, who used his time away from Cupertino to not only found Next but also buy a fledgling animation studio that would become Pixar, but Apple racked up more than its share of stumbles. Under several post-Jobs CEOs, Apple tried repeatedly—and failed repeatedly—to release an updated successor to the aging Macintosh operating system. Taligent was the future. Then Copland—”Mac OS 8”—was hyped as the new direction for the OS, only to be abandoned and replaced with an incremental update to the original Mac OS.
In late 1996, Apple CEO Gil Amelio announced that the company would acquire Next for $400 million. That deal brought Steve Jobs back to Apple, initially as an advisor to Amelio. At the time, Apple declared “the advanced technical underpinnings and rapid development environment of [what became Mac OS X] will allow developers to create new applications that leapfrog those of other ‘modern’ operating systems, such as Windows NT.”
Apple was right—Next’s operating system became the basis for Mac OS X—but it’s unlikely that Amelio predicted precisely how the acquisition would play out. In July of 1997, Apple’s board of directors voted to remove Amelio from his post, naming Jobs the company’s interim CEO.
That move kicked off an era of increasing—and, to date, unceasing—success for Apple and Jobs. In Jobs’s August 1997 Macworld Expo keynote, Apple announced that it was ending the licensing program that allowed other companies to sell Mac-compatible computers and that Microsoft had invested $150 million in the company. Both controversial moves paid off.
A year later, Steve Jobs unveiled the product that perhaps singularly kicked off Apple’s rebound: the original iMac. Jobs had asked designer Jonathan Ive—whom he’d eventually promote to the role of senior vice president of industrial design—to create a colorful, easy-to-set-up, all-in-one computer. The result was a new Mac with a unique look that startled the industry. Its bold color, lack of a floppy drive, and embrace of the new USB connectivity standard were all considered shockers at the time; consumers, however, were delighted. Apple sold 800,000 iMacs in fewer than five months. The floppy faded into history and USB became a roaring success. The iMac, and the Jobs/Ive partnership, cemented Apple’s stance that its insanely great products needed to look the part.
In March 2001, Apple released the first iteration of Mac OS X after a public beta that began in late 2000. The operating system was based on NextStep, the Unix-based OS devised by Jobs’s team at Next. Though it was named as a simple sequel to OS 9, OS X had an entirely new codebase and marked a dramatic new beginning. Jobs had overseen a massive effort at Apple to create native, Unix-based ports of the original Macintosh APIs—the programming hooks upon which Mac developers relied—in a system called Carbon. That meant that developers could, with some exceptions, make their software compatible with OS X merely by recompiling it, without needing to rewrite the software from scratch. And applications that weren’t updated for OS X could take advantage of the integrated Classic environment to run OS 9 apps within OS X—making the transition from OS 9 to OS X significantly less painful than many people expected it to be. OS X was a towering achievement for Jobs and Apple, and a welcome respite from the years of promised but unrealized OS upgrades from Cupertino.
Jobs oversaw other massive software undertakings around this time, too. In 1998, the company’s QuickTime authoring standard was being threatened in the digital video editing space by Microsoft’s Advanced Authoring Format; Avid and Adobe had both moved away from the format, and only Macromedia’s KeyGrip software—which had recently been rebranded “Final Cut”—still incorporated it. But Final Cut had been ignored and delayed by the Macromedia higher-ups in favor of development on its Flash software, and its future was thus largely uncertain.
Something had to be done to combat these issues. That solution, as overseen by Jobs, was to buy Final Cut. The company used it to accelerate development on the QuickTime standard, releasing the first Apple-branded version, Final Cut Pro, at 1999’s National Association of Broadcasters show. Final Cut Pro 1.0 was designed to provide editors interested in the non-linear space a simpler, low-cost way to get into the business—and to ensure that QuickTime would not go the way of some of Apple’s lost software technologies.
Jobs was refining Apple’s message: The company made the computer you used to create, to explore, to “think different.” And as a direct result of the company’s investment into high-end non-linear editing software, Apple could explore a new area—consumer-level editing.
Similarly, one of the most significant consumer-level Apple products to emerge at this time wasn’t hardware, but software: iLife. The company was ahead of the rest of the industry in realizing that digital media—music, videos, and photos—would soon become central to people’s lives. In 1999, Apple released iMovie (and shipped it with a new iMac DV, for Digital Video), a program designed to let even the most novice computer users download video from their video camera and easily turn it into high-quality movies, complete with transitions, titles, and effects.
That was followed, in 2001, by iTunes (which debuted early in the year but became much more significant with the fall debut of the iPod) and iDVD, the latter of which let home-video takers create standard DVDs of their movies, including menus, themes, chapters, and slideshows. And 2002 brought the debut of iPhoto, which similarly made it easy to download and organize photos from digital cameras. By 2003, Apple had improved these programs’ integration with each other and rolled them into a single package, iLife, that shipped with every Mac.
The impact of iLife is often overlooked: It meant that at a time when digital media was ascendant, and Apple was trying to differentiate its hardware from the competition, every Mac included a suite of great, easy-to-use software that let people create and manage that media—something that wasn’t true of any other computer on the market at the time.
“We don’t think the PC is dying at all,” Jobs said during his 2001 Macworld Expo keynote where he discussed Apple’s digital hub strategy. “It’s evolving.”
Apple’s retail strategy evolved as well. In 2001, the company opened its first retail stores, at a time when other PC makers—most notably Gateway—were stumbling with brick-and-mortar outlets. A decade later, Apple operates more than 300 stores around the globe. The stores first turned a profit in 2004; last year, they recorded $9 billion in retail sales with $2.4 billion in retail profit. More significantly, as Apple likes to point out in its quarterly earnings reports, 50 percent of the people buying computers at the Apple Store are first-time Mac customers.
“People just don’t want to buy personal computers any more,” Jobs said in a 2001 video introducing the stores and their philosophy. “They want to know what they can do with them. And we’re going to show them exactly that.”
Four years after the introduction of OS X, Jobs and Apple instituted another transition—this one away from the PowerPC architecture to chips built by Intel. It was a big gamble for a company that had relied on PowerPC processors since 1994, but Jobs argued that it was a move Apple had to make to keep its computers ahead of the competition. “As we look ahead… we may have great products right now, and we’ve got some great PowerPC product[s] still yet to come,” Jobs told the audience at the 2005 Worldwide Developers Conference. “[But] we can envision some amazing products we want to build for you and we don’t know how to build them with the future PowerPC road map.”
The transition went much faster—and much more smoothly—than anyone, including Apple, had anticipated, thanks in large part to Rosetta. The dynamic translator let applications designed for PowerPC systems run on Intel-based Macs, giving developers time to revamp their products for Apple’s Intel-based future. In fact, PowerPC apps only became obsolete this summer, when Apple retired Rosetta with the introduction of Mac OS X Lion.
Beyond the Mac
Of course, the assorted transitions during Jobs’s reign as CEO weren’t confined to the Mac. Perhaps the greatest transition Jobs initiated was moving Apple away from being just a software and computer maker and into the lucrative world of consumer electronics. The shift became official in 2007 when Apple dropped the word “Computer” from its name, simply calling itself Apple Inc.
The shift began with the iPod. When Apple unveiled its music player in the fall of 2001, the market for MP3 players was in its early stages. Devices at the time relied on small amounts of flash memory that could hold only a handful of songs. In short, it was a field that was ripe for innovation—and innovate Apple did with the iPod. The device’s 5GB capacity gave it the storage space to, in Apple’s words, “put 1000 songs in your pocket.” And while not the first hard-drive-based digital music player on the market—Creative’s Nomad series beat it to the punch—the iPod had something going for it that no other company could match: software integration. Though iTunes debuted earlier in 2001, it was with the iPod’s fall introduction that the pieces clicked into place and Apple’s ecosystem started to take shape.
Still, at the time, the iPod met with heavy skepticism. Why was Apple, a computer company, making a portable music player? “We love music,” Jobs said during the iPod’s introduction. “And it’s always good to do something you love.”
It proved to be lucrative for Apple, too. The company has sold hundreds of millions of iPods in the last decade, and though sales growth slowed and then declined in recent years, Apple continues to enjoy a 70 percent share of the MP3 player market. Part of the reason for the device’s success? Apple’s repeated willingness to reinvent the iPod line. Take 2005’s decision to kill off the popular iPod mini and replace it with the smaller, flash-based iPod nano. That kind of thinking, utterly foreign to most companies, was second nature to Steve Jobs: Why not kill a product at the height of its popularity if you’re going to replace it with something even better?
Steve Jobs seemed to anticipate the demand for the iPod from the get-go: “Music’s a part of everyone’s life,” Jobs said at the 2001 launch event. “Music’s been around forever. This is not a speculative market. And because it’s a part of everyone’s life, it’s a very large target market all around the world.”
As it did with the iPod, Apple didn’t create a new product category with 2007’s iPhone introduction. Smartphones existed before Apple came out with its effort, with existing devices aimed largely at business customers who wanted to check their email when they were out and about. Apple instead set its sights on the broader consumer market. It would appeal to the end user by informing its device with the same sensibilities it had used in the Mac: good design, ease of use, and a harmonious marriage between software and hardware.
“Every once in a while a revolutionary product comes along that changes everything,” Jobs said at the 2007 Macworld Expo keynote when he pulled the first iPhone out of his pants pocket. “One is very fortunate if you get to work on just one of these in your career. Apple’s been very fortunate. It’s been able to introduce a few of these into the world.”
That may sound like the kind of “reality distortion field”-style hype that Jobs became famous for—and to some extent, it is. But it also happens to be true. Look no further than how other smartphone makers responded—with devices that mirrored the iPhone’s touch-screen controls, powerful Web browser, and array of third-party mobile apps. Where once every smartphone had to have a physical keyboard, many now rely upon just a touchscreen; that’s a direct result of the iPhone’s influence.
Jobs closed out his tenure as Apple’s CEO by leading the company into what’s being billed as the “post-PC” era—a period in which mobile devices no longer need to sync up with computers. It was with that vision in mind that Apple rolled out the iPad, which brings PC-style computing into a handheld device. Launched less than two years ago, the iPad has already carved out a new market for tablet computing, with other companies once again trying to keep pace with Apple. It also joins the original Mac, the iPod, and the iPhone among the revolutionary products Jobs helped develop during his Apple career.
Jobs was diagnosed with pancreatic cancer in 2004. After surgery he returned to Apple, but had to take another leave of absence in 2009, ultimately undergoing a liver transplant. He took his final leave of absence in January 2011.
In August, he formally resigned as CEO. “I have always said if there ever came a day when I could no longer meet my duties and expectations as Apple’s CEO, I would be the first to let you know. Unfortunately, that day has come,” Jobs said in a letter addressed “to the Apple Board of Directors and the Apple Community.”
“I believe Apple’s brightest and most innovative days are ahead of it. And I look forward to watching and contributing to its success in a new role,” Jobs wrote. “I have made some of the best friends of my life at Apple, and I thank you all for the many years of being able to work alongside you.”
It would be a mistake to characterize Jobs’s time at Apple simply by the products the company released. Those products came about because of principles held by Jobs that he made sure were shared by others at Apple, especially as he refashioned the company following his 1997 return to Cupertino.
The products mentioned throughout this story might not have come to pass were it not for Apple’s constant need to innovate. That’s an attitude driven by Jobs, during flush times as well as when the tech business was less than booming. It’s worth noting that some of Apple’s biggest product releases during Jobs’s tenure—the iPod and the iPad, most notably—were developed during recessions, when consumers theoretically were less inclined to spend money on pricey electronics.
“The way we’re going to survive is to innovate our way out of this,” Jobs told Time Magazine in early 2002, a strategy the company returned to when the economy went south again in 2008. In both instances, Apple under Jobs upped its research-and-development spending, helping the company produce a strong product lineup that could weather tough times.
It goes without saying that under Jobs, Apple became synonymous with great design. From the early days of the Macintosh, when Jobs agitated for rectangles with rounded corners, no aspect of the design process escaped the company’s attention.
But Jobs was about more than design just for the sake of looking good—the design decisions Apple makes also take usability into account. That 2002 Time Magazine article recounts the creation of the first flat-panel iMac and how Jobs scrapped an early version of the desktop because its design failed to impress; Time’s Josh Quittner recounted the subsequent meeting between Jobs and Apple executive Jonathan Ive.
That’s an approach to creating products that sticks with other Apple employees, even after they leave the company. “You almost imagine that Steve is in your office,” Flipboard founder and ex-Apple engineer Evan Doll told the San Francisco Chronicle. “You say to yourself, what would he say about this? When you’re kicking around an idea for a product, or for a feature, you’ll even say it in discussion—’Steve Jobs would love this!’ or, more often, ‘Steve Jobs would say this isn’t good enough.’ He’s like the conscience sitting on your shoulder.”