Forty years after he made the observation that became the most famous law in computing, Gordon Moore still has a few words of advice for the industry.
For software developers: Simplify! Your interfaces are getting worse. Nanotechnology? Don’t believe the hype; silicon chips are here to stay. Artificial intelligence? You’re barking up the wrong tree, folks.
Speaking by telephone from Hawaii, where he now lives, Moore fielded an hour of questions from reporters Wednesday to mark the approaching 40th anniversary of his celebrated prediction — that the number of transistors on integrated circuits would double roughly every two years.
Christened later as Moore’s Law, his observation became something of a self-fulfilling prophecy for the industry, he said, driving computer makers to keep pace with the expected rate of advancement. But he was too humble to claim credit for a phenomenon that effectively made possible the rapid evolution of modern electronics.
“If I hadn’t published this paper in ’65, the trends would have been obvious a decade later anyway. I don’t think a particular paper made a difference. I was just in a position where I could see the trend,” he said.
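To get a feel for how quickly that doubling compounds, here is a minimal Python sketch of the rule; the starting transistor count and the time spans are illustrative assumptions, not figures from Moore's paper or the interview.

```python
# Back-of-the-envelope illustration of the doubling rule behind Moore's Law:
# if transistor counts double roughly every two years, growth is exponential.

DOUBLING_PERIOD_YEARS = 2

def projected_count(start_count: int, years: float) -> float:
    """Project a transistor count forward, doubling every two years."""
    return start_count * 2 ** (years / DOUBLING_PERIOD_YEARS)

if __name__ == "__main__":
    start = 2_000  # hypothetical starting point, chosen only for illustration
    for years in (10, 20, 40):
        print(f"after {years:2d} years: ~{projected_count(start, years):,.0f} transistors")
```

Over 40 years the rule compounds to a factor of 2^20, or roughly a millionfold, which is why the observation carried the industry as far as it did.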
Moore, now 76, was director of research and development at Fairchild Semiconductor when his paper was published in Electronics magazine on April 19, 1965. Three years later he founded Intel Corp. with Robert Noyce, becoming its chief executive officer (CEO) in 1975 and chairman four years after that.
His law had little effect at first, he said. The first big impact he recalls came when Japanese manufacturers entered the memory chip business in the 1970s. For a while, the Japanese struggled to find their footing in a business where the technology seemed to advance unpredictably.
“Once they saw the memory series developing — from 1K, to 4K, to 16K — they had a method by which to plan where the industry would end up, and they were very successful at intersecting the trajectory and taking a leading position,” he said.
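The memory series he cites follows the same arithmetic: each generation holds four times as many bits as the last, that is, two doublings per step. The short sketch below simply restates the quoted numbers; it is an illustration, not anything from the interview.

```python
# The DRAM generations Moore cites (1K, 4K, 16K bits) each quadruple in capacity,
# i.e. two doublings per generation, which is what made the roadmap predictable.

def memory_series(start_bits: int = 1024, generations: int = 3) -> list[int]:
    """Return successive memory-chip capacities, quadrupling each generation."""
    return [start_bits * 4 ** g for g in range(generations)]

print([f"{bits // 1024}K" for bits in memory_series()])  # prints ['1K', '4K', '16K']
```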
Moore reread his paper about a year ago, he said, and was pleasantly surprised to find that it also foresaw the use of computers in the home, although by the time the first home computers appeared he had forgotten he ever made that prediction. In fact, as Intel's CEO years later, he would dismiss home computing altogether.
“An engineer came to me with an idea about a home computer,” he recalled. “I said, ‘Gee, that’s fine but what would you use it for?’ He could only think of a housewife using it to keep recipes on. I didn’t think that would be a very powerful application, so I didn’t think Intel should pursue a personal computer at that time.”
In general, the computing industry has done “a pretty good job” over the years, he said. But he singled out software interfaces — and by implication Microsoft Corp., which has dominated PC software for decades — for particular criticism. By cramming ever more features into applications, software makers may actually be moving backward, not forward, he said.
“As people make improvements in the interface, the complexity seems to grow, and I think if anything we’re losing ground a bit in general purpose computing,” Moore said. “They want to offer so many new functions in applications, it’s difficult to simplify everything at the same time.”
When it comes to nanotechnology, Moore described himself as “a skeptic,” with little faith that it will replace silicon-based integrated circuits in mainstream use any time soon.
“There’s a big difference between making one tiny transistor and connecting a billion of them together to do a useful function,” he said. “That’s something I think people often overlook.”
Far from being outdated, the integrated circuit is spreading into new fields, such as gene chips for disease analysis, airbag sensors and “microfluidics,” which he described as a tiny chemistry lab on a chip. In a sense, he noted, silicon chips have become nanotechnology themselves, since they now include features smaller than 100 nanometers, a threshold commonly used to define nanoscience.
Asked about artificial intelligence, he said computers as they are built today will not come close to replicating the human mind, because they were designed from the outset to handle information in a different way. Scientists, he said, need to figure out more clearly how the mind works, then build a computer from scratch to mimic it.
“I think computers are actually going in the wrong direction” when it comes to replicating human intelligence, he said.
Still, he said, computers may mimic parts of human intelligence, such as the ability to recognize language and distinguish, for example, whether a person is saying “two” or “too.”
“I think when it recognizes language that well, then you can start to have an intelligent conversation with your computer and that will change the way you use computers dramatically,” he said. That level of intellect may be anything from 10 to 50 years away, he added.
He’s excited about the future of computing, which he said will bring “mind-boggling” developments. “I sure wish I could be around in 40 years to see what happens,” he said.
Asked to come up with a new law that might carry the industry forward for another 40 years, Moore declined. He acknowledged several times that he is no longer as close to modern computing as he once was.
“I think I’ll rest on my laurels on this one,” he said.