For archival purposes, I'm reposting the article on ElasticVapor. To see the original post, please visit VMblog.com
--------------
The 2009 Cloud Experience
The year 2008 has been a big one for cloud computing. In a rather dramatic shift, we've seen the term "cloud" enter the collective IT consciousness. Almost every technology vendor, big or small, has embraced the move to the cloud as both a software and a marketing philosophy. Broadly, cloud computing can be viewed as an Internet-centric software and services model. For the data center specifically, it represents the opportunity to apply the decentralized, fault-tolerant characteristics of the Internet to the infrastructure itself.
Until now, software companies didn't have to concern themselves with concepts like "scale" or adaptive infrastructure capacity. In the traditional 1990s desktop software model, users of a given application were responsible for its installation, administration and operation, typically on a single computer. Each desktop formed a "capacity silo", somewhat separate from the greater world around it. Now, with the rising popularity of Internet-based applications, the need for remote capacity to handle an ever-expanding, always-connected online user base is becoming a crucial aspect of any modern software architecture. In 2008 we even witnessed the typically desktop-centric Microsoft jump into the fray, outlining a vision for Software + Services (S+S), which they describe as local software and Internet services interacting with one another.
Looking at 2009, I feel the greater opportunity will be in the merger of the next generation of "virtualized" data centers with a global pool of cloud providers to create scalable hybrid infrastructures geared toward an optimal user experience. Cisco's Chief Technology Officer, Padmasree Warrior, recently referred to this merger as the "intra-cloud". Outlining Cisco's vision, Warrior said that cloud computing will evolve from private, stand-alone clouds to hybrid clouds, which allow movement of applications and services between clouds, and finally to a federated "intra-cloud". She elaborated on the concept at a conference in November: "We will have to move to an 'intra-cloud,' with federation for application information to move around. It's not much different from the way the Internet evolved." Given Cisco's hybrid vision, I believe the end-user experience will be the key factor driving the usage of cloud computing both internally and externally.
To enable this hybrid future, cloud interoperability will be front and center in 2009. Before we can create a truly global cloud environment, a set of unified cloud interfaces will need to be defined. One such initiative, which I helped create, is the Cloud Computing Interoperability Forum (CCIF). The CCIF was formed to enable a global cloud computing ecosystem in which organizations work together toward wider industry adoption of cloud computing technology and related services. The forum is a little over three months old and has grown to almost 400 members, encompassing almost every major cloud vendor. A key focus of the forum is the creation of a common, agreed-upon framework/ontology that lets two or more cloud platforms exchange information in a unified manner. To help accomplish this, the CCIF will be working in 2009 on the creation of a Unified Cloud Interface (UCI), or cloud broker. The cloud broker will serve as an open interface for interaction with remote cloud platforms, systems, networks, data, identity, applications and services. A common set of cloud definitions will enable vendors to exchange management information between remote cloud providers. By enabling industry-wide participation, the CCIF is helping to create a truly interoperable global cloud, which should also improve the cloud-centric user experience.
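To make the broker idea concrete, here is a minimal sketch of what a unified interface might look like. All of the names and method signatures below are hypothetical illustrations (the real UCI was still being defined at the time of writing): the point is simply that a common vocabulary lets one broker route requests to interchangeable vendor adapters.

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Adapter wrapping one vendor's API behind a common vocabulary."""
    @abstractmethod
    def launch(self, image: str, count: int) -> list:
        """Start `count` instances of `image`; return their IDs."""
    @abstractmethod
    def terminate(self, instance_id: str) -> None:
        """Stop a running instance."""

class LocalTestProvider(CloudProvider):
    """In-memory stand-in for a real vendor adapter."""
    def __init__(self):
        self._next_id = 0
        self.running = set()
    def launch(self, image, count):
        ids = []
        for _ in range(count):
            iid = "local-%d" % self._next_id
            self._next_id += 1
            self.running.add(iid)
            ids.append(iid)
        return ids
    def terminate(self, instance_id):
        self.running.discard(instance_id)

class CloudBroker:
    """Single entry point that routes unified requests to registered providers."""
    def __init__(self):
        self._providers = {}
    def register(self, name, provider):
        self._providers[name] = provider
    def launch(self, provider_name, image, count=1):
        return self._providers[provider_name].launch(image, count)

broker = CloudBroker()
broker.register("local", LocalTestProvider())
ids = broker.launch("local", image="web-app", count=2)
print(ids)  # ['local-0', 'local-1']
```

In this shape, swapping one provider for another (or moving a workload between clouds) is a change of registration, not a change of application code, which is the essence of what a common ontology buys you.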
In the startup space, a number of notable companies have appeared to address the concept of "scale" by creating load-based cloud monitoring and performance tools. For the most part, these tools have focused on cloud infrastructure environments such as Amazon's Elastic Compute Cloud. An increasing number of cloud providers have also appeared at a regional level in Europe and Asia, providing resources to scale on a geographical basis for the first time. Thanks in part to interoperability efforts as well as advances in wide-area computing, we may soon be able to scale not only on superficial metrics such as load, but on practical questions: how fast does my application load for users in the UK?
The quality of the user experience as the basis for scaling and managing your infrastructure will be a key metric in 2009. The general problem is that a given cloud vendor/provider may be living up to the letter of its SLA, and thus rating high on quality of service, while its actual users are very unhappy because of a poor experience. In many ways the traditional SLA is becoming somewhat meaningless in a service-focused IT environment. With the emergence of global cloud computing, we have the opportunity to build an adaptive infrastructure focused on the metric that matters most: the end user's experience while using your application. Whether you are servicing an internal business unit within an enterprise or a group of customers accessing a website, ensuring an optimal experience for those users is what will keep them coming back, and ultimately what will define a successful business.
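A sketch of what experience-driven scaling might look like: instead of watching server CPU load, the controller keys its decision on the latency users actually observe. Everything here (function names, the 800 ms target, the p95 statistic) is an illustrative assumption, not any vendor's real API.

```python
def p95(samples):
    """95th-percentile of a list of latency samples (milliseconds)."""
    ordered = sorted(samples)
    return ordered[int(0.95 * (len(ordered) - 1))]

def scale_decision(current_instances, latency_samples_ms,
                   target_ms=800, min_instances=1):
    """Return a new instance count based on perceived user latency.

    Grows the pool when the p95 latency users see exceeds the target,
    shrinks it when there is comfortable headroom; server-side load
    never enters the decision.
    """
    observed = p95(latency_samples_ms)
    if observed > target_ms:
        return current_instances + 1   # users are waiting: add capacity
    if observed < target_ms * 0.5 and current_instances > min_instances:
        return current_instances - 1   # ample headroom: shed capacity
    return current_instances

# Simulated measurements from real user sessions, e.g. browsers in the UK.
slow = [900, 1100, 950, 1200, 1000]
fast = [120, 150, 180, 140, 130]
print(scale_decision(4, slow))  # 5
print(scale_decision(4, fast))  # 3
```

The interesting design choice is the input: feeding the controller end-user measurements (per region, even) rather than data-center load is precisely what turns an SLA number into an experience metric.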
A key player in the emerging role of the cloud as a user-experience enabler is Microsoft, with its UC Quality of Experience (QoE) program. Although initially aimed at VoIP applications, I feel its core concepts apply well to the cloud. The QoE program is described as a comprehensive, user-focused approach to perceived quality, centered on actual users and incorporating all significant influencing parameters in optimizing the user experience. Real-time metrics help measure, quantify and continuously monitor the user's perceived, subjective quality of the experience as it happens. QoE may very well be a key component in Microsoft's plan to dominate the cloud, and it resonates with what I'm hearing in the community at large. As trusted enablers, companies like IBM, Cisco, Sun and Microsoft are in the prime spot to address the current trend toward "trusted" cloud computing providers. They have the know-how, the global networks and, most importantly, the budgets to make this a reality.
Possibly the biggest opportunity in the coming year will be for those who embrace the "cloud stack": a computing stack that takes a real-time look at a given user's experience, and an environment that can adapt and autonomously take corrective action to continuously optimize the user's subjective experience on any network, anywhere in the world, at any time. Sun Microsystems' John Gage was right when he famously said, "The network is the computer." It's just taken us 25 years to realize that the "Internet" (the cloud) is the computer.