Has Moore’s Law generated Moore’s Era?
DOI: 10.1063/PT.5.8112
“It’s amazing how often I run across a reference to Moore’s Law,” says Washington Post economics columnist Robert J. Samuelson.
The Wall Street Journal celebrated the law’s 50th anniversary with an 18 April front-page report that begins:
Silicon Valley pioneer Gordon Moore laid out a bold theorem 50 years ago. Engineers would cram twice as many transistors on tiny squares of silicon every year or so, producing more and more power in ever-smaller machines.
His extrapolation, known as Moore’s Law, has been one of the most enduring precepts of the technology industry, foretelling the revolutionary emergence of personal computers, mobile phones, Web servers and network routers. Each generation of chips usually brought more performance at a lower cost.
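That doubling claim is simple compound growth, and the numbers it implies are easy to reproduce. The sketch below is a minimal illustration, not anything from the Journal’s report; the baseline of roughly 2,300 transistors, the two-year doubling period, and the 50-year span are assumptions chosen only to show how the compounding works.

```python
# Illustrative sketch of Moore's-Law-style compounding.
# The baseline count, doubling period, and time span are assumptions
# for illustration, not figures taken from the article.

def transistors(years_elapsed, n0=2_300, doubling_period_years=2.0):
    """Project a transistor count, assuming it doubles every
    doubling_period_years, starting from a baseline of n0."""
    return n0 * 2 ** (years_elapsed / doubling_period_years)

# Starting from an assumed early-1970s-style baseline of ~2,300 transistors
# and doubling every two years, 50 years of compounding reaches
# tens of billions of transistors.
print(f"{transistors(50):,.0f}")  # -> 77,175,193,600
```

Shortening the assumed doubling period to one year, as in the report’s “every year or so,” pushes the projected count far higher still.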
The WSJ’s news report quickly turns from celebration to reconsideration, explaining that although 14-nanometer-scale circuitry for the latest chips means squeezing hundreds of millions more transistors onto a chip, it also means large cost increases. The article ends, though, by citing “dramatic steps” that some makers of data-storage chips are taking:
Producers of chips called NAND flash memory used in smartphones and an increasing number of computers have decided to stop shrinking transistors, worried that smaller circuitry won’t store data reliably.
Instead, they plan to stack circuits in three dimensions—32 or 48 layers per chip—rather than on a flat square of silicon to keep boosting the capacity of their devices.
Micron and Intel expect to produce so-called 3-D NAND chips that initially store as much as 384 gigabits of data, or three times more than conventional memory chips.
Later this year, Intel expects to deliver a chip for specialized applications with eight billion transistors—or 133 million times more than chips had when Mr. Moore made his projection.
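The quoted figures can be sanity-checked with back-of-the-envelope arithmetic. The sketch below is mine rather than the Journal’s; it simply back-solves from the numbers in the quotation and reads “three times more” as “three times as much.”

```python
# Back-of-the-envelope checks on the figures quoted above; purely illustrative.

# 384 gigabits described as three times the capacity of conventional chips
# implies conventional parts of roughly 128 gigabits.
conventional_gigabits = 384 / 3
print(conventional_gigabits)  # -> 128.0

# Eight billion transistors described as 133 million times the count on chips
# from 1965 implies roughly 60 transistors per chip when Moore wrote.
implied_1965_count = 8_000_000_000 / 133_000_000
print(round(implied_1965_count))  # -> 60
```

The implied count of roughly 60 transistors per chip is in the right neighborhood for the integrated circuits Moore was examining in 1965, which is presumably where the 133-million-times comparison comes from.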
Meanwhile on the opinion page, WSJ technology writer Michael S. Malone celebrates Moore’s Law under the headline “The promise at technology’s powerful heart,” with the thumbnail summary “As Moore’s Law turns 50, the revolution in computing it foretold is on the cusp of even more-radical progress.” Malone sees Moore’s Law as “the heartbeat of the modern world” and declares that it “seems more likely than ever that a thousand years from now, what will be remembered most about our time will be its stunning efflorescence of innovation and entrepreneurship. By then Moore’s Law will have become Moore’s Era.”
Often the celebratory thrust of the reporting relies on catchy observations, as in USA Today’s coverage.
A short commentary in Fortune, by contrast, takes a more cautionary view.
Supratik Guha, director of physical sciences at IBM, contributed the Fortune commentary. After suggesting that “tell-tale signs” show that Moore’s Law is slowing and after declaring, “we are almost certain that the law will cease to hold within a decade,” Guha writes:
With further miniaturization silicon transistors will attain dimensions of the order of only a handful of atoms and the laws of physics dictate that the transistors and electronic circuits will cease to work efficiently at that point. As Moore’s Law slows down, innovations in other areas, such as developments in software, will pick up the slack in the short term.
But in the longer term, there will be fundamental changes in the essential design of the classical computer that, remarkably, has remained unchanged since the 1950s. Designed for precise calculations, today’s computing machines do not make inferences and qualitative decisions, or recognize patterns from large amounts of data efficiently. The next substantive leap forward will be in computers with human-like cognitive capabilities that are also energy efficient. IBM’s Watson, the computing system that won the television game show Jeopardy! in 2011, consumed about 4000 times more energy than its human competitors. This experience reinforced the need for new energy efficient computing machines that are designed differently from the sequential, calculative methodology of classical computers and are inspired, perhaps, by the way biological brains work.
The Washington Post’s Samuelson sums up this way:
Moore’s Law is a quiet rebuke to those who think we control our destiny. The historical reality is that technological, commercial and intellectual upheavals—often unforeseen—set in motion forces that create new opportunities and threats. We struggle to master what we haven’t anticipated and often don’t understand. The explosion of computing power imagined by Moore is one of those spontaneous transformations that define and dominate our era.
Los Angeles Times columnist Michael Hiltzik is both sunnier and briefer in his own summing up.
---
Steven T. Corneliussen, a media analyst for the American Institute of Physics, monitors three national newspapers, the weeklies Nature and Science, and occasionally other publications. He has published op-eds in the Washington Post and other newspapers, has written for NASA’s history program, and is a science writer at a particle-accelerator laboratory.