In a 1965 speech, computer scientist Gordon Moore, who shortly thereafter became a founder of Intel, predicted that personal-computer speeds would double every year. That impressive observation, soon dubbed Moore’s Law, surprised millions of people, prepared the PC-buying world for the “whatever you buy today will be obsolete next year” syndrome that dogs us to this day, and became a favorite of journalists everywhere. Writers trot it out to justify whatever argument they happen to be making, from “Buy the stuff reviewed in our magazine” to “Apple is dead.”
Get It Right
But believe it or not, Moore’s Law is malarkey. First of all, we didn’t even get the quote right. Moore wasn’t talking about computer speed doubling at all; he was talking about the number of transistors on a typical chip, which isn’t necessarily related. (For that matter, he didn’t actually say that the doubling would take place every year; he really said “every 18 to 24 months.”) Second, even if we had understood it right, Moore’s Law wouldn’t be accurate anyway. The first Mac, in 1984, ran at 8MHz; if that speed had doubled every 18 months, we would now be scooting along on 8,192MHz PowerPC chips that would melt right through our desks and straight on down to China.
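The arithmetic behind that 8,192MHz figure holds up; a quick sketch of the calculation (the 18-month doubling interval and the 8MHz starting point are both from the text):

```python
# Check the column's math: an 8MHz Mac in 1984, with speed doubling
# every 18 months, by 1999 would have undergone ten doublings.
start_mhz = 8                   # original Macintosh, 1984
months_elapsed = (1999 - 1984) * 12   # 180 months
doublings = months_elapsed // 18      # 10 doublings
speed_mhz = start_mhz * 2 ** doublings
print(speed_mhz)  # 8192
```

Ten doublings of 8MHz does indeed give 8,192MHz, several times the 400MHz state of the art the column cites next.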
Instead, the state of the Macintosh art today is a 400MHz PowerPC G3 chip. Now, the following may shock you, so I hope you’re sitting down: I think that today’s chips are fast enough. I’m not one of these people who scans the Mac rumor Web sites every week looking for news of the G4, G12, or G28 chip; my Mac scrolls, displays graphics, and totals spreadsheets instantaneously. I can’t imagine why anyone would want to pay more for a faster chip.
I’m not saying that I think Macs are fast enough; I’m saying that the chips are fast enough. There’s a big difference. For example, ten years ago, today’s Macs would have been considered unimaginably quick. (Remember the Mac IIx, ten times slower than today’s Power Macs? This very magazine called it “wicked fast.”) And yet, consider how many times every day you wind up just sitting there waiting for something to happen: every time you launch a program, turn file sharing on, switch your AppleTalk connection, dial the Internet, wake up your computer from sleep, or (this is the big one) start up the computer. These and many other bottlenecks make a mockery of our hopes that faster processors will help us get our work done sooner.
Fortunately, as the computer industry heads into the year 2000, a few bright minds have begun to discover that software can compensate for such bottlenecks. These clever programmers are making it clear that to get faster Macs, we don’t have to wait, like sitting ducks, for the next generation of PowerPC chips.
For example, programs like SpeedStartup (Casady & Greene) and StartupDoubler (shareware) memorize your extensions, shaving 30 seconds or more off every start-up. The iBook’s new Save And Shut Down command is equally brilliant: according to Apple’s prerelease documents, it memorizes the current status of all your open programs. When you turn on the iBook again later, you’re taken directly back to whatever you were doing, bypassing the entire start-up and document-opening sequences. Any program (such as Intuit’s Quicken and Palm Desktop) that attempts to autocomplete your typing is also a godsend.
Also doing remarkable work in speed-smart software is, of all companies, Microsoft. Because the company’s gigantic applications may never actually run quickly, they compensate by saving us time in other ways. Classic example: a single click on the glorious AutoFill icon on the Internet Explorer tool bar tells the application to fill your name, address, phone number, and other repetitive information into the blanks on any Web page. Similarly, when you save a document, Word proposes naming the file after the first line of the document (“November Meeting Agenda,” for example) instead of “Untitled.” You save ten seconds each time it guesses right.
Only if such not-so-artificial intelligence blossoms in our everyday software (and in the Mac OS) will we ever catch up to the speed gains promised by Moore’s So-Called Law. Otherwise, we’ll continue to suffer from the effect described by Pogue’s Law: any extra speed introduced by faster chips is soon offset by increasingly bloated software.
DAVID POGUE is the author of the upcoming The iBook for Dummies (IDG Books Worldwide, 1999).
November 1999, page 198