Moore’s Law is a difficult concept to wrap our heads around. We can see the impact of exponentially rising computing power all around us, but how do we relate to the power of a microprocessor chip doubling every two years? We understand it, but do we truly “get it”? Today’s edition of The Economist provides what I believe is one of the best layperson’s illustrations of the incredible advances in processing power over the past four decades. In the article “The future of computing: The era of predictable improvement in computer hardware is ending. What comes next?”, The Economist gives an excellent description of Moore’s Law:
IN 1971 the fastest car in the world was the Ferrari Daytona, capable of 280kph (174mph). The world’s tallest buildings were New York’s twin towers, at 415 metres (1,362 feet). In November that year Intel launched the first commercial microprocessor chip, the 4004, containing 2,300 tiny transistors, each the size of a red blood cell.
Since then chips have improved in line with the prediction of Gordon Moore, Intel’s co-founder. According to his rule of thumb, known as Moore’s law, processing power doubles roughly every two years as smaller transistors are packed ever more tightly onto silicon wafers, boosting performance and reducing costs. A modern Intel Skylake processor contains around 1.75 billion transistors—half a million of them would fit on a single transistor from the 4004—and collectively they deliver about 400,000 times as much computing muscle.
This exponential progress is difficult to relate to the physical world. If cars and skyscrapers had improved at such rates since 1971, the fastest car would now be capable of a tenth of the speed of light; the tallest building would reach halfway to the Moon.
The impact of Moore’s law is visible all around us. Today 3 billion people carry smartphones in their pockets: each one is more powerful than a room-sized supercomputer from the 1980s. Countless industries have been upended by digital disruption. Abundant computing power has even slowed nuclear tests, because atomic weapons are more easily tested using simulated explosions rather than real ones. Moore’s law has become a cultural trope: people inside and outside Silicon Valley expect technology to get better every year.
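Those figures are easy to sanity-check. The short Python sketch below works out the doubling period implied by a 400,000-fold gain and reproduces the car and skyscraper extrapolations; note that the 2015 launch year for Skylake and the mean Earth-Moon distance are my own assumptions, not figures from the article.

```python
import math

# Sanity-checking the article's numbers. Assumption: Skylake
# launched in 2015 (the article does not give a year).
years = 2015 - 1971
growth = 400_000  # "about 400,000 times as much computing muscle"

# If growth = 2 ** (years / T), the doubling period T is:
doubling_period = years * math.log(2) / math.log(growth)
print(f"Implied doubling period: {doubling_period:.1f} years")  # ~2.4

# Applying the same 400,000x factor to the 1971 benchmarks.
daytona_kph = 280                 # Ferrari Daytona top speed
towers_m = 415                    # twin towers height
light_kph = 299_792.458 * 3600    # speed of light, km/s -> km/h
moon_km = 384_400                 # assumed mean Earth-Moon distance

print(f"Car: {daytona_kph * growth / light_kph:.2f} of light speed")                 # ~0.10
print(f"Building: {towers_m * growth / 1000 / moon_km:.2f} of the way to the Moon")  # ~0.43
```

Both extrapolations land almost exactly where the article puts them: about a tenth of the speed of light, and roughly halfway to the Moon.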
The article goes on to argue that while Moore’s Law is coming to an end, computing power can be expected to keep increasing rapidly thanks to cloud computing, deep-learning technology (AI) and new computing architectures: “specialised chips optimised for particular jobs, say, and even exotic techniques that exploit quantum-mechanical weirdness to crunch multiple data sets simultaneously”. And the final verdict on the end of Moore’s Law:
For more than 50 years, the seemingly inexorable shrinking of transistors made computers steadily cheaper and more capable. As Moore’s law fades, progress will be less metronomic. But computers and other devices will continue to become more powerful—just in different and more varied ways.
A very good article by The Economist and well worth the read.