Tuesday, March 30, 2010

Cloud Washing Goes Green(Peace)

It may not come as a surprise that many in the technology industry have recently been engaged in what has been described as "Cloud Washing" -- the attempt to ride the surge in interest in anything relating to Cloud Computing (or anything sporting a cloud logo). Adding to the noise is everyone's favorite environmental watchdog, Greenpeace, with a new report released earlier today titled "Make IT Green - Cloud Computing and its Contribution to Climate Change".

Before I comment let me make a few things clear.
1. I do want to make sure the earth remains livable long after I'm gone from it.
2. Being environmentally friendly is not a chore, but an obligation we all share.
3. Economic growth is good and should give those fortunate enough to accumulate some wealth the ability to help improve the lives of all people, rich or poor.
4. As an entrepreneur I have an obligation to my family, myself and my shareholders to do whatever I can to accumulate as much monetary value as possible.
5. The previous 4 points are not mutually exclusive.
So back to the report.

To put it bluntly, its logic is flawed. If I'm reading this report right (which I may not be), the growth of cloud computing is what makes it inherently less environmentally friendly. In a nutshell, it's our very success that makes the cloud not so good for the environment.

Let me point out a few gems from the report:

"Cloud Computing is growing at a time when climate change and reducing emissions from energy use is of paramount concern. With the growth of the cloud, however, comes an increasing demand for energy. For all of this content to be delivered to us in real time, virtual mountains of video, pictures and other data must be stored somewhere and be available for almost instantaneous access. That ‘somewhere’ is data centres - massive storage facilities that consume incredible amounts of energy." > Agreed

Next up Facebook vs. Yahoo

"For example, in January 2010, Facebook commissioned a new data centre in Oregon and committed to a power service provider agreement with PacificCorp, a utility that gets the majority of its energy from coal-fired power stations, the United States’ largest source of greenhouse gas emissions. Effectively becoming an industrial-scale consumer of electricity, Facebook now faces the same choices and challenges that other large ‘cloud-computing’ companies have in building their data centres. With a premium being placed on access to the cheapest electricity available on the grid. In many countries, this means dirty coal."

They go on to point out: "other companies have made better decisions for siting some of their data centres. Yahoo!, for instance, chose to build a data centre outside Buffalo, New York, that is powered by energy from a hydroelectric power plant - dramatically decreasing its carbon footprint. Google Energy, a subsidiary of cloud leader Google, applied and was recently approved as a regulated wholesale buyer and seller of electricity in the United States, giving it greater flexibility as to where it buys its electricity to power its data centres."

Now let me put on my Gordon Gekko hat for a moment. -- Growth is good, Growth works, Growth, for lack of a better term, works -- Do you actually think that Google, Yahoo and others were able to grow to the size they are today using green data centers? No way. They did it the same way every other company does: using the power and energy sources available to them at the least cost. Yep, coal. Sure, after you go public and have billions of dollars available it's easy to put on your Green Hat. But being competitive, especially as a newer player in the business scene, means you don't have the luxury of billions of dollars to play with. In order to have this kind of money at your disposal you must first grow. Grow your user base, grow your revenue and grow your profits. Greenpeace is completely forgetting what allowed Google and Yahoo to get to the point they are at today and instead expects everyone to be on some kind of bizarre equal footing, which isn't possible.

The report is also full of interesting numbers, which to me are pretty much, well, numbers. Here's one of the more interesting ones: "PC ownership will quadruple between 2007 and 2020 to 4 billion devices, and emissions will double over the same period, with laptops overtaking desktops as the main source of global ICT emissions (22%)"

I do agree with some of the report, in particular "More cloud-computing companies are pursuing design and siting strategies that can reduce the energy consumption of their data centres, primarily as a cost containment measure. For most companies, the environmental benefits of green data design are generally of secondary concern."

Regardless of whether you call it Cloud Computing or just computing, this trend toward data center centric or Internet centric computing isn't new. It's been a long time coming. And if anything, the move away from traditional desktop centric computing to network based alternatives consolidates computing resources from a broad, heterogeneous, distributed group of users with little or no control over power consumption to a more central point of energy consumption. Let me put it another way: those 10 million people who never turn their desktops off are not green either. But a data center that is optimized, and has it in its best interest to save a buck by making its infrastructure "Green", is much more likely to do so. Not for the sake of saving the planet, but for the sake of making more money.

At the end of the day, although the report's logic is generally flawed, the rationale for the report is not. The environment is important. I can't argue with that. The document is definitely worth the read.

Monday, March 29, 2010

Forget BigData, Think MacroData & MicroData

Recently I've been hearing a lot of talk about the potential for so-called "BigData", an idea that has emerged out of Google's approach to storing large, dispersed data sets in what they call "BigTable". The general idea of Google's BigTable is a distributed storage system for managing structured data that is designed to scale to a very large (Big) size: petabytes of data across thousands of commodity servers. Basically, a way for Google's engineers to think about working in big ways -- a data mantra.
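
To make that data model concrete, here's a minimal toy sketch (my own illustration, not Google's actual API) of the sorted, multidimensional map the BigTable paper describes, where values are addressed by row key, column and timestamp:

```python
from bisect import insort

class ToyBigTable:
    """Toy, single-node illustration of BigTable's data model: a sorted
    map from (row, column, timestamp) -> value. The real system splits
    sorted row ranges ("tablets") across thousands of commodity servers;
    this sketch just shows why keeping rows sorted matters."""

    def __init__(self):
        self._keys = []    # sorted list of (row, column, timestamp) keys
        self._cells = {}   # (row, column, timestamp) -> value

    def put(self, row, column, timestamp, value):
        key = (row, column, timestamp)
        if key not in self._cells:
            insort(self._keys, key)   # maintain sorted row order
        self._cells[key] = value

    def scan(self, start_row, end_row):
        """Range scan over contiguous rows -- the access pattern that
        lets row ranges be split up and served independently."""
        for key in self._keys:
            if start_row <= key[0] < end_row:
                yield key, self._cells[key]

# usage sketch
table = ToyBigTable()
table.put("com.example/index", "contents:html", 1, "<html>...</html>")
table.put("com.example/index", "anchor:news.com", 1, "Example link")
for key, value in table.scan("com.example", "com.example~"):
    print(key, value)
```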

Many of the approaches to BigData have grown from roots found in traditional grid and parallel computing realms, such as non-relational, column-oriented database systems (NoSQL), distributed filesystems and distributed memory caching systems (Memcache). These platforms typically form the basis for many of the products and services found within the broader BigData and NoSQL trends and their associated ecosystem of startups (some of which have been grouped in the broad Cloud computing category). The one thing that seems consistent among all the BigData applications and approaches is that the concept emerged from within very large, data-intensive companies such as Google, Yahoo, Microsoft and Facebook. Because of the scale of these companies, they were forced to rethink how they managed an ever-increasing deluge of user-created data.

To summarize the trend in the most simplistic terms -- (to me) it seems to be a kind of Googlification of IT. In a sense we are all now expected to run Google-scale infrastructure without the need for a physical Google infrastructure. The problem is that the infrastructure Google and other large web-centric companies have put in place has less to do with any particular technology and more to do with handling a massive and continually evolving global user base. I believe this trend has to do with the methodology these companies apply to their infrastructure, or more importantly, the way they think about applying that methodology to their technology.

I think this trend towards BigData may also miss the bigger opportunities, mostly because of the word BIG. For most companies, size in terms of raw storage is less important than scale. And when I say scale (relative magnitude), mostly what I mean is the time it takes me to get a job done (logarithmic scale). Again, the problem with BigData is the relative nature of the word "Big": how big is big? Is my notion of big the same as yours? And just because I have petabytes of data doesn't mean I should work on petabyte workloads. I believe a better and more descriptive terminology would be one that is less subjective yet still broad enough to describe the problem. Think MacroData or MicroData.

Macro = Very large in scale or scope or capability
Micro = Extremely small in scale or scope or capability

The bigger question is what happens when you start to think about handling data as many smaller workloads (Micro), which collectively may be distributed across many geographies and environments (Macro). On a macro level the data could be very large in size, but individually, on a micro level, each piece could be just a few bytes (think a Twitter status update). The other benefit of thinking about data from a micro or macro standpoint is that you start to think about the metrics that matter most: how fast can I achieve my particular goals? For me the goal is getting data analyzed in a time frame as close to real time as possible. This means data workloads that are processed in smaller, byte-sized pieces when they are created and, if possible, before they are stored / warehoused. It's easier to read data than it is to process data into information.
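
Here's a rough sketch of what I mean, under the assumption that the micro workload is something tweet-sized: each record is processed the moment it arrives and a running result is available in near real time, instead of warehousing everything first and batch-processing it later.

```python
from collections import Counter

def micro_process(record_stream):
    """Process tweet-sized records one at a time as they arrive (micro),
    keeping a running aggregate so an answer is available in near real
    time, rather than storing everything and batch-processing it later."""
    hashtag_counts = Counter()
    for record in record_stream:              # each record: a few bytes
        for word in record.split():
            if word.startswith("#"):
                hashtag_counts[word] += 1
        yield hashtag_counts.most_common(3)   # current top hashtags

# usage sketch: in practice the stream could span many geographies (macro)
updates = ["#cloud is the new #data", "#data beats #bigdata", "#cloud wins"]
for snapshot in micro_process(updates):
    print(snapshot)
```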

Thinking about data in terms of scale and time will result in significantly more useful ways of solving many more real-world problems. I'm very interested in hearing other opinions on this -- or am I just getting caught up in the semantics of the English language?

Friday, March 26, 2010

April Cloud Travel: Beijing & Rio de Janeiro

I've got a busy month of travel coming up in April, with trips to China and Brazil. As part of my schedule I usually try to set aside some time to meet with interesting local companies involved in cloud computing and adjacent areas. If you're interested in meeting up for lunch, drinks or a leisurely stroll through the Amazon -- the jungle, that is -- please let me know.

Beijing, China - April 12-16th - Intel Developers Forum (13-14th)

Rio de Janeiro, Brazil - April 26-30th - CloudCamp Rio (27th)

As a side note, we're still looking for additional sponsors for CloudCamp Rio; if you're interested in helping out, please let us know. You'll get some visibility in one of the fastest growing markets for cloud computing.

Thursday, March 25, 2010

VAPOR = Virtualized Automated Provisioning Of Resources

I wanted to let everyone know about an interesting acronym created by AT&T's cloud guru @JoeWeinman that does a great job of describing what a private cloud is: (V)irtualized (A)utomated (P)rovisioning (O)f (R)esources, or VAPOR. Just add "elastic" and I think we may be on to something here. So next time someone asks whether private clouds are real, tell 'em they're just Vapor, or if you prefer, ElasticVapor.

Weinman has also come up with an interesting acronym for CLOUD: (C)ommon, (L)ocation-independent, (O)nline, (U)tility, on-(D)emand. Basically, it says clouds are ubiquitous.

I'd also highly recommend following Weinman's blog at Cloudonomics.com.

Tuesday, March 23, 2010

The Cloud of Unknowing

What I find interesting is that among the many pitching cloud services today there seems to be a kind of growing duality around the various claims being made for and against using cloud computing. For those without a clear business model, it represents everything wrong with technology: it's insecure, used by criminals, more expensive and therefore can't be trusted -- it's worse than bad because it's new. For those who do have a business model (or think they do), it's more cost effective, more scalable, more efficient; basically, it's just better because it's new.

What is clear is that most of the companies making these claims for or against its use suffer from the same problem: it's practically impossible to prove or disprove their claims. This is because the very nature of the term itself has become all-encompassing. As Larry Ellison pointed out, what isn't cloud computing these days? It's basically anything on or touching the Internet. For those arguing their points, it seems to come down to belief rather than any kind of factual information. You believe it's better, so it is, or vice versa.

At the end of the day "cloud" is what it is -- a marketing buzzword, nothing more, nothing less. So I say enjoy the buzz while it lasts, because inevitably there will be something new and shiny to replace it.

Wednesday, March 17, 2010

Elastic Regionalized Performance Testing

I'm happy to announce that we (Enomaly) have teamed up with SOASTA, the leader in cloud testing, to validate that cloud service providers using Enomaly's Elastic Computing Platform (ECP) can deliver the demanding SLAs increasingly required by their customers. ECP's carrier-class architecture supports very large cloud platforms, spanning multiple datacenters in disparate geographies. SOASTA's CloudTest® methodology helps ensure that customers achieve high performance from their cloud infrastructure anywhere in the world.

So what does this mean? In addition to the ECP platform, the hardware used by the individual service provider and the applications they host can have a significant impact on performance. SOASTA is providing its CloudTest service to Enomaly service providers so they, in turn, can build a high level of confidence in their customers' website performance. This is particularly important as companies are increasingly responding to unexpected peaks that come from the impact of social networking, external events and promotions. Companies must accurately test dynamic applications at web-scale, often running in complex infrastructures. SOASTA's CloudTest provides the proof that your application can and will scale.

Another important factor is the concept of regionalized ECP-based clouds. From Sweden to Japan, SOASTA's CloudTest service combined with Enomaly's global customer base of cloud service providers will allow, for the first time, a real-world environment for geography-specific performance testing. Before the emergence of regionalized cloud infrastructure, this kind of Elastic Regional Performance Testing was not even a possibility. Wondering how your web infrastructure will perform in Japan using resources in Japan? Well, wonder no more.
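
For illustration only (this is not SOASTA's CloudTest API, just a back-of-the-napkin sketch), the simplest possible version of a regional test is a latency probe run from a machine inside the target region against a hypothetical endpoint:

```python
import time
import urllib.request
from statistics import mean

def probe(url, attempts=5):
    """Crude latency probe: run it from a node inside the target region
    (say, Japan) to see roughly what local users would experience."""
    timings = []
    for _ in range(attempts):
        start = time.monotonic()
        with urllib.request.urlopen(url, timeout=10) as response:
            response.read()
        timings.append(time.monotonic() - start)
    return mean(timings), max(timings)

# hypothetical application endpoint hosted on a regional ECP-based cloud
average, worst = probe("http://example.com/")
print(f"average {average:.3f}s, worst {worst:.3f}s over 5 requests")
```

A real CloudTest run obviously does far more than this (concurrency, ramp-up, web-scale load), but the underlying question is the same: how does the application respond when measured from where the users actually are?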

Monday, March 15, 2010

Data is the New Oil

I'm sitting in my hotel room in Santa Clara at the Cloud Connect event, a conference focused on the future of business computing -- so what better setting to discover that Facebook has passed Google as the most-viewed site in the US over the past week. An amazing feat to say the least, but why? How did this come to be? How, in a little over six years, did an upstart manage to surpass Google?

To understand, you really need to think about what the PC era has done to information. In effect, the PC revolution started what the Internet has supercharged: information creation. Think about it: more information is now being created in the time it takes me to write this post than was probably created between the time humans first figured out how to write and the birth of the Internet.

But for the most part, the majority of the information humankind has created has not been accessible. Most of this raw data or knowledge has been sitting in various silos -- be it a library, a single desktop, a server, a database or even a data center. But recently something changed: the most successful companies of the last decade have discovered how to tap into this raw data. These companies are better at analyzing, mining and using this mountain of data sitting "out there" -- turning a useless raw resource into something much more useful: information.

Before you say anything, Yes I know I'm not the first to say this. In a 2006 post Michael Palmer wrote "Data is the new oil!" declaring "Data is just like crude. It’s valuable, but if unrefined it cannot really be used. It has to be changed into gas, plastic, chemicals, etc to create a valuable entity that drives profitable activity; so must data be broken down, analyzed for it to have value."

In the most simplistic terms, Palmer's post serves as a kind of data manifesto, directly outlining why companies like Facebook, Twitter and Google will rule the next generation of computing. Not because they have more money, but because they have tapped into something much larger. They have figured out that those who can most effectively turn data into information will win.

Facebook & Twitter in a sense created the largest social analytics engine on the planet. They essentially know what we're thinking before we do. Using this raw data they can effectively predict trends and more importantly capitalize on these trends with the greatest of ease.

A recent article in the Economist puts the idea of data as power into perspective. "When the Sloan Digital Sky Survey started work in 2000, its telescope in New Mexico collected more data in its first few weeks than had been amassed in the entire history of astronomy. Now, a decade later, its archive contains a whopping 140 terabytes of information. A successor, the Large Synoptic Survey Telescope, due to come on stream in Chile in 2016, will acquire that quantity of data every five days."

The article goes on to make the case: "All these examples tell the same story: that the world contains an unimaginably vast amount of digital information which is getting ever vaster ever more rapidly. This makes it possible to do many things that previously could not be done: spot business trends, prevent diseases, combat crime and so on. Managed well, the data can be used to unlock new sources of economic value, provide fresh insights into science and hold governments to account."

Now just imagine anyone with a credit card given access to near limitless cloud computing resources. Yes, Data is the new Oil.

Wednesday, March 10, 2010

Ubiquitous Computing - Merging Banks & Mobile Providers

I've been fairly swamped this week with meetings so I really haven't had much time to think about writing any blog posts, but some recent news has inspired me. Before I get into my post, I also wanted to let everyone know that I'll be in Santa Clara next week attending & speaking at the Cloud Connect Conference. So if you want to meet up, ping me.

As part of Cloud Connect I'll be on the Ubiquitous Computing panel with Alistair Croll, co-founder of Bitcurrent. The topic actually relates quite well to some of the latest news coming out of China this week.

For those of you unfamiliar with the concept, the general idea of Ubiquitous Computing is a shift in computing where small, inexpensive, robust networked processing devices are distributed at all scales throughout everyday life, generally embedded in commonplace objects and used for everyday purposes. More simply, computers are in everything and everywhere, from a carton of milk to a bus pass. For me, this is an obvious first step towards a pervasive use of technology within the more mundane aspects of people's everyday lives -- one where you have access to all the information you'll ever need, when you need it.

Over the last few years many have argued that this ubiquitous transition is either currently underway or still years away. From where I sit, I believe the trend is now fully underway, and you need look no further than some of the moves happening in the banking and mobile industries as proof.

As further proof of this transition, this week China Mobile Ltd., the world's largest phone company by market value (with over 508 million customers), agreed to buy 20 percent of Shanghai Pudong Development Bank Co. for $5.8 billion to expand its electronic-payment business in China. According to a Businessweek article, China Mobile and Pudong Bank will form a strategic alliance to offer wireless finance services including mobile bank cards and payment services.

At first glance you might ask why a phone company is buying a major stake in a bank, and more importantly, why you should care. First of all, if you've ever traveled to Asia, then you'll instantly know why. In the West we tend to use credit cards and debit cards for most transactions -- and I'm told some actually still use paper money too. But in Asia, mobile phones are quickly becoming the preferred method for buying everything from subway access to dinner. Through the use of RFID or other means, the mobile phone seems to be the way most prefer to pay for things both big and small.

China Mobile isn't alone in seeing the opportunity in merging the more traditional banking aspects with the fast-growing mobile market. Last year in South Korea, SK Telecom agreed to buy a stake in Hana Financial Group Inc.'s credit-card unit, while Globe Telecom Inc. agreed to buy 40 percent of BPI-Globe BanKO Savings Bank in 2008. There have been similar deals in Japan and other areas as well. At the heart of this transformation is the concept of ubiquitous network access to computing resources through a pervasive network of mobile devices. It will be interesting to see if others will follow the lead of China Mobile and embrace this new always-connected world of computing gadgets.

Thursday, March 4, 2010

Enomaly ECP 3.1.0 Service Provider Edition Released

Just a quick note to let everyone know about the latest ECP 3.1 release. Among the various improvements is the addition of sparse disk support. In a nutshell, sparse disk support allows you to over-commit your storage environment, offering more storage than you actually have available. The advantage of sparse files is that storage is only allocated when actually needed: disk space is saved, and large files can be created even if there is insufficient free space on the file system. For example, you can give a customer a 2TB storage quota, but if that customer only uses 3MB of actual data, then only 3MB is written to disk. This means you can offer (and potentially charge for) storage you don't physically have available. It also means that sparse root images can be provisioned significantly faster because, well, you probably already guessed it, less physical storage is required even for VMs that appear to have larger boot/root volumes. Yet another way we help cloud service providers make money.
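
To see how sparse allocation works at the filesystem level, here's a minimal sketch (illustrative only, not ECP's actual implementation): create a file with a huge apparent size while consuming almost no physical disk blocks.

```python
import os

path = "sparse-root.img"        # hypothetical disk image name
apparent_size = 2 * 1024**4     # the 2TB quota promised to the customer

# Seek to the end of the would-be 2TB file and write a single byte.
# The filesystem records the apparent size but only allocates blocks
# for data actually written -- the rest is a "hole".
with open(path, "wb") as image:
    image.seek(apparent_size - 1)
    image.write(b"\0")

stats = os.stat(path)
print("apparent size:", stats.st_size)                 # ~2TB
print("physical space used:", stats.st_blocks * 512)   # a few KB at most
```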

Here is the release overview.

Enomaly is proud to announce the latest release of ECP Service Provider Edition. This version brings the following improvements and changes:

  • VMCreator (CLI and GUI) has been updated to allow selection of Sparse or Raw disk images. This allows a choice between higher-performance Raw images and Sparse images that are smaller on disk and faster to provision.
  • Increased performance and stability of disk I/O back end.
  • Reduced timeouts on SSL front end.
  • Many improvements to REST API calls.
  • Security enhancement to the GUI VMCreator: ISO images now need to be uploaded to the server before provisioning.
  • Additional real/virtual disk usage tracking in Admin UI graphs to accommodate sparse images.
  • Various security improvements to ECP application repo and core.
  • Billing delegates can now be created to allow non-admin users to access metering information.
If you're an existing ECP customer, just run the typical YUM update.
