Executive Summary
Moore's Law is the observation made in 1965 by Gordon Moore, co-founder of Intel, that the number of transistors per square inch on integrated circuits had doubled every year since the integrated circuit was invented. Moore predicted that this trend would continue for the foreseeable future.
In subsequent years the pace slowed somewhat, but data density has still doubled approximately every 18 months; this is the current definition of Moore's Law, which Moore himself has endorsed. Most experts, including Moore, expect the law to hold true until roughly 2020-2025.1
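To give a rough sense of what that doubling rate implies (an illustrative back-of-the-envelope calculation), capacity that doubles every 18 months grows by a factor of about 2^(10/1.5), or roughly one hundred-fold, every decade.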
Amazingly, Moore's Law applies not only to the computational speed of microchips, which has accelerated steadily and nonlinearly. It also extends to all the other components of the computer: the memory that stores information; the software applications that enable computers to perform their myriad tasks, individually and collectively; the sensors (cameras, transducers, and other devices that detect light, movement, position, and sound and convert them into digital data streams); and communications within and between computers.
We have had only a decade of experience with this incredible phenomenon: exponential computing power, storage, data transmission, and ubiquitous connectivity coming together in what we know as cloud computing. Quite suddenly, we can interact with people on the other side of the planet in mere milliseconds. As a result, more people can compete, connect, and collaborate on more things, for less money, with greater ease and equity than ever before.
The Future Impact of Moore's Law