First, I'll admit it: I am among the group of cloud advocates who routinely claim that cloud computing is green, and I say this without any proof or evidence to support the statement. I make the claim as part of my broader pitch for cloud computing; I say it as a sales and marketing guy, not as an advocate. As an advocate, I'd like to have some empirical data to support my position. Believe me, I've searched and searched. Although there are piles of forecasts about the potential market for cloud computing, said to be in the billions, little exists to support the green / eco-friendly argument.
On the face of it, a major incentive to move to cloud computing is that it appears to be more environmentally friendly than traditional data center operational / deployment models. The general consensus is that reducing the number of hardware components and replacing them with remote cloud computing systems lowers the energy spent running and cooling hardware, shrinks your carbon footprint, and that higher data center consolidation / optimization conserves energy. But a major problem still remains: where is the proof?
The problem is that there is no uniform way to measure this supposed efficiency. None of the major cloud companies are providing utilization data, so it is not possible to know just how efficient cloud computing actually is, other than that it sounds and feels more green.
The problem is measuring the hypothetical. What is the hypothetical footprint of a startup that might have chosen to build its own data center versus using someone else's? Things like transportation, development, construction, and management are very difficult to measure and arguably still create vast amounts of CO2, yet they are generally not taken into consideration. The power sources can also have dramatically different CO2 footprints, say coal versus wind or nuclear.
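To see how much the power source alone can swing the math, here's a rough, hypothetical sketch. The carbon-intensity numbers are commonly cited lifecycle approximations, not figures for any particular grid or provider, and the consumption figure is made up purely for illustration:

```python
# Hypothetical footprint of the same workload on different power sources.
# Carbon intensities are rough, commonly cited lifecycle approximations
# (grams of CO2e per kWh), not measurements of any specific grid or provider.

GRID_INTENSITY_G_PER_KWH = {
    "coal":    1000,  # rough lifecycle estimate
    "wind":      12,  # rough lifecycle estimate
    "nuclear":   12,  # rough lifecycle estimate
}

annual_consumption_kwh = 50_000  # hypothetical yearly draw of a small deployment

for source, intensity in GRID_INTENSITY_G_PER_KWH.items():
    tonnes_co2e = annual_consumption_kwh * intensity / 1_000_000
    print(f"{source:>8}: ~{tonnes_co2e:.1f} tonnes CO2e per year")
```

Same servers, same workload, wildly different footprint, which is exactly why a single "cloud is green" claim is hard to pin down.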
Then there is the question of consumption. We now have the ability to run our applications on thousands of servers, something that previously wasn't even possible. To put it another way, we can potentially burn through several years' worth of energy in literally a few hours, where previously that wasn't even an option. So, hypothetically, we're using more resources, not less. On the flip side, if we had bought those thousand servers and left them running (underutilized), the power usage would be significantly higher. But then again, buying those servers would have been out of reach for most, so it's not a fair comparison. There we are, back where we started. You may use 80% less energy per unit, but with 1,000% more capacity, at the end of the day you're using more energy, not less.
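To make that back-of-the-envelope point concrete, here's a minimal sketch using the illustrative figures above (80% savings per unit, 10x the capacity); these are hypothetical numbers, not measured utilization data:

```python
# Back-of-the-envelope comparison using the illustrative figures above.
# All numbers are hypothetical; plug in real utilization data if you ever find any.

baseline_units = 100            # servers you might have bought and run yourself
baseline_energy_per_unit = 1.0  # arbitrary energy units per server

# Cloud scenario: 80% less energy per unit, but 1,000% more capacity (10x the units)
cloud_energy_per_unit = baseline_energy_per_unit * (1 - 0.80)
cloud_units = baseline_units * 10

baseline_total = baseline_units * baseline_energy_per_unit
cloud_total = cloud_units * cloud_energy_per_unit

print(f"Traditional total energy: {baseline_total:.0f}")  # 100
print(f"Cloud total energy:       {cloud_total:.0f}")     # 200, more, not less
```

Per-unit efficiency goes up, total consumption goes up anyway. That's the whole tension in a dozen lines.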
I'm not alone in this thinking. More broadly, the International Organization for Standardization (ISO) considers the label "environmentally friendly" too vague to be meaningful because there is no single international standard for the concept. There are a few emerging data center energy efficiency initiatives, notably by the EPA in the United States through its Energy Star program. The EPA programs are working to identify ways in which energy efficiency can be measured, documented, and implemented in data centers and the equipment they house, especially servers. This may be the foundation for measuring cloud "eco-friendliness", but until cloud computing providers step up and provide the data, it does little to resolve the question.
Let me be clear: I'm not saying cloud computing isn't green. I'm sure that if you compared a traditional data center deployment to a near-exact replication in the cloud, you'd find the cloud to be more efficient. The problem is that there is currently no way to justify that statement without some kind of data to support it.
If you know of some hard data, please feel free to pass it along.