Recently I've been asked about the benefits of cloud computing compared to virtualization. Generally my answer has been that they are an ideal match. For the most part, virtualization has been about doing more with less (consolidation). VMware in particular positioned their products and pricing in a way that encourages you to use as few servers as possible. The interesting thing about cloud computing is that it's about doing more with more. Or if you're Intel, doing more with Moore.
At its core, Intel is a company driven by one singular mantra: "Moore's Law". According to Wikipedia, Moore's Law describes an important trend in the history of computer hardware: the number of transistors that can be inexpensively placed on an integrated circuit increases exponentially, doubling approximately every two years. The observation was first made by Intel co-founder Gordon E. Moore in a 1965 paper.
Over the last couple of years we have been working very closely with Intel, specifically in the area of virtualization. During this time we have learned a lot about how they think and what drives them as an organization. In one of my early pitches we described our approach to virtualization as "Doing more with Moore," a play on the common phrase "doing more with less" combined with some of the ideas behind "Moore's Law", which is all about growth and greater efficiencies. They loved the idea: for the first time, someone was looking at virtualization not purely as a way to consolidate a data center but as a way to scale your overall capacity more effectively.
What is interesting about Moore's Law in regard to cloud computing is that it is no longer just about how many transistors you can get on a single CPU, but about how effectively you spread your compute capacity across more than one CPU, be it multi-core chips, or among hundreds or even thousands of connected servers. Historically, the faster the CPU gets, the more demanding the applications built for it become. I am curious whether we're on the verge of seeing a similar "Moore's Law" applied to the cloud. And if so, will it follow the same principles? Will we start to see a "Cloud Law" where the amount of cloud capacity doubles every 18 months, or will we reach a point where there is never enough excess capacity to meet the demand?
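To make the doubling arithmetic behind such a "Cloud Law" concrete, here is a minimal sketch; the function name and the ten-year horizon are my own illustrative choices, not figures from the discussion above:

```python
def capacity_multiplier(years: float, doubling_period: float) -> float:
    """How many times capacity grows after `years`,
    assuming it doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Doubling every 2 years (the classic transistor-count observation):
print(capacity_multiplier(10, 2))    # 10 years -> 32.0x

# Doubling every 18 months (a hypothetical "Cloud Law"):
print(capacity_multiplier(10, 1.5))  # 10 years -> roughly 100x
```

The comparison shows how sensitive exponential growth is to the doubling period: shaving six months off it roughly triples the capacity gained over a decade.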