Was Moore's Law Inevitable?


Moore's Law is a popular axiom in computing that effectively says: every two years, the number of transistors on an integrated circuit doubles. In practice, this means that computers get faster and cheaper, fast... and this doubling gets out of control very quickly.
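To see how quickly the doubling compounds, here is a minimal sketch in Python, projecting forward from Intel's 4004 of 1971 (which had roughly 2,300 transistors, the start of the range covered by the chart):

```python
# Project Moore's Law forward from the Intel 4004 (1971), which had
# roughly 2,300 transistors, doubling the count every two years.
count = 2_300
for year in range(1971, 2009, 2):
    print(f"{year}: ~{count:,} transistors")
    count *= 2  # one doubling per two-year step
```

Running this puts the 2007 projection at around 600 million transistors, in the same ballpark as real processors of that era, and it is why charts of Moore's Law are usually plotted on a logarithmic scale.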

The "law" is named for Intel's Gordon Moore, who is popularly credited with observing this doubling trend in 1965. Above is a chart (from Wikipedia) that demonstrates evidence of Moore's Law from 1971 through 2008. For now, Moore's Law still seems to be cranking along, just as it has for over thirty years. But why does it work? Is Moore's Law just how technology works, an inevitably byproduct of science married to business? What can we learn by digging into Moore's Law, its history, and how it applies to human endeavors outside of computers? A fascinating new article by Kevin Kelly digs into the question, including an excellent history of exponential growth in computers and other industries. Kelly writes:

While expectations can certainly guide technological progress, consistent law-like improvement must be more than self-fulfilling prophecy. For one thing, this obedience to a curve often begins long before anyone notices there is a law, and way before anyone would be able to influence it. [...] Ray Kurzweil dug into the archives to show that something like Moore's Law had its origins as far back as 1900, long before electronic computers existed, and of course long before the path could have been constructed by self-fulfillment. Kurzweil estimated the number of "calculations per second per $1,000" performed by turn-of-the-century analog machines, by mechanical calculators, and later by the first vacuum tube computers, and extended the same calculation to modern semiconductor chips. He established that this ratio increased exponentially for the past 109 years. ...
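To put Kurzweil's 109-year exponential in perspective, here is a back-of-the-envelope calculation. It assumes, purely for illustration, the same two-year doubling period as the transistor version of the law; Kurzweil's actual fitted rate is not quoted here:

```python
years = 109            # span of Kurzweil's data, roughly 1900 onward
doubling_time = 2.0    # assumed doubling period in years (illustrative)

doublings = years / doubling_time
factor = 2 ** doublings
print(f"{doublings:.1f} doublings -> ~{factor:.1e}x more calculations/sec per $1,000")
# 54.5 doublings -> ~2.5e+16x improvement
```

Even under that crude assumption, a century of steady doubling works out to a roughly ten-quadrillion-fold improvement, which is the sense in which exponential growth "gets out of control very quickly."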

Read the rest for an excellent, insightful article on Moore's Law and what it means.