Thursday, April 30, 2009
US Federal CIO Cloud Computing Summit
Following President Obama's recent directive that the government be more innovative and collaborative, the Federal Chief Information Officers Council (CIOC) has established an initiative to explore the application of cloud computing to federal activities. What is becoming clear is that cloud computing will play a very important role in the US Federal Government's forthcoming IT strategy. As part of this strategy, the CIOC has created a Federal Working Group on Cloud Computing. The purpose of this event is to solicit information and advice from the private sector on the issues facing the federal government as it seeks to take advantage of cloud computing technologies.
The American Council for Technology (ACT), a non-profit public-private organization established to help government use IT, has been asked to host the meeting with leading private sector companies involved with cloud computing. The meeting will be held on May 6, 2009 in Washington, DC.
My goal is to lobby for a greater cloud-centric interoperability mandate from the federal government, as well as increased cooperation between the federal government's IT organizations and the cloud industry, although this meeting is itself a huge step forward.
During the cloud summit I will make sure to raise the following key issues for cloud computing within the federal government:
1. Security & Privacy
2. Data and Application Interoperability & Compatibility
3. Data and Application Portability
4. Governance and Management (IT Cloud Policy)
5. Metering, Monitoring & Scaling
6. Mobile Access Concerns
If anyone has any points they would like me to raise during the summit, please let me know.
I'll post more details in the coming days including a full overview after the summit.
Wednesday, April 29, 2009
Amazon Offering Academia Free Cloud Services
Amazon.com has announced a grant program that offers academia free access to its cloud services. The program, titled AWS in Education, offers $100 worth of free usage per student, but only to courses and academics approved by Amazon. Faculty and researchers around the globe can apply online for the grants, which will be judged on the “originality” of their ideas. Amazon has committed to donating at least $1 million worth of free AWS access, but doesn’t have a set timeline.
According to the AWS in Education website, AWS in Education provides a set of programs that enable the worldwide academic community to easily leverage the benefits of Amazon Web Services for teaching and research. With AWS in Education, educators, academic researchers, and students can apply to obtain free usage credits to tap into the on-demand infrastructure of Amazon Web Services to teach advanced courses, tackle research endeavors and explore new projects – tasks that previously would have required expensive up-front and ongoing investments in infrastructure.
With AWS you can requisition compute power, storage, database functionality, content delivery, and other services — gaining access to a suite of elastic IT infrastructure services as you demand them. AWS enables the academic community to inexpensively and rapidly build on global computing infrastructure to pursue course projects and accelerate their productivity and research results, while enjoying the same benefits of reliability, elasticity, and cost-effectiveness used by industry.
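To make that concrete, here is a minimal sketch of what "requisitioning compute power" can look like from Python using the boto library; the region, AMI id and instance type are placeholder assumptions, and credentials are assumed to be configured in the environment.

```python
import boto.ec2  # Python SDK for AWS; credentials are read from the environment

def launch_course_instance(ami_id):
    """Requisition a small slice of compute on demand -- the sort of task
    an education grant's usage credits would cover."""
    conn = boto.ec2.connect_to_region("us-east-1")
    reservation = conn.run_instances(ami_id, instance_type="m1.small")
    return reservation.instances[0]

instance = launch_course_instance("ami-12345678")  # placeholder AMI id
print(instance.id, instance.state)
```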
The AWS in Education program offers:
- Teaching Grants for educators using AWS in courses (plus access to selected course content resources)
- Research Grants for academic researchers using AWS in their work
- Project Grants for student organizations pursuing entrepreneurial endeavors
- Tutorials for students who want to use AWS for self-directed learning
- Solutions for university administrators looking to use cloud computing to make the university’s IT infrastructure more efficient and cost-effective
Monday, April 27, 2009
Cloud Seeding a Cloud Content Distribution Network (C2DN)
Cloud Seeding: the act of combining many smaller regional hosting firms & cloud service providers into a single global compute cloud.
Use Case: Enables the creation of a low-cost cloud content distribution network (C2DN) that combines a series of regional cloud service providers who cooperate transparently to deliver content to end users, most often to improve performance, scalability, and cost efficiency.
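To illustrate the idea, here is a rough sketch (provider names and coordinates are invented) of the routing decision at the heart of a C2DN: send each user to the closest participating regional provider.

```python
import math

# Hypothetical registry of regional providers participating in the C2DN.
PROVIDERS = {
    "us-east": (40.7, -74.0),
    "eu-west": (51.5, -0.1),
    "apac":    (35.7, 139.7),
}

def nearest_provider(user_lat, user_lon):
    """Route a request to the closest participating provider, using
    great-circle distance as a crude stand-in for network latency."""
    def angular_distance(lat, lon):
        cos_angle = (math.sin(math.radians(user_lat)) * math.sin(math.radians(lat))
                     + math.cos(math.radians(user_lat)) * math.cos(math.radians(lat))
                     * math.cos(math.radians(user_lon - lon)))
        return math.acos(min(1.0, max(-1.0, cos_angle)))
    return min(PROVIDERS, key=lambda name: angular_distance(*PROVIDERS[name]))

print(nearest_provider(48.8, 2.3))  # a user near Paris resolves to "eu-west"
```

A real C2DN would measure actual latency and provider load rather than geography, but the cooperative routing decision is the same.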
Industry Led Cloud Standards Incubator Launched
The Distributed Management Task Force (DMTF), the organization that brings the IT industry together to collaborate on systems management standards development, validation, promotion and adoption, today announced that it has formed a group dedicated to addressing the need for open management standards for cloud computing. The “Open Cloud Standards Incubator”, currently led by AMD, Cisco, Citrix, EMC, HP, IBM, Intel, Microsoft, Novell, Red Hat, Savvis, Sun Microsystems, and VMware, will work to develop a set of informational specifications for cloud resource management.
As virtualization technology continues to be more rapidly adopted, it is emerging as a common enabling foundation for delivering software solutions into IT environments along with the potential to lower IT costs and improve operational efficiencies. While deploying virtualization technologies it is also critical to have comprehensive management capabilities associated with the implementation. Along with the adoption of virtualization, more and more enterprise IT customers are looking at the cloud computing paradigm to better deliver services to their customers.
"Ensuring interoperability among clouds is essential to the proliferation and adoption of cloud computing among developers and enterprise users," said Lew Tucker, CTO, Cloud Computing, Sun Microsystems. "Sun is committed to bringing open and interoperable clouds to market and is joining other leading vendors in actively engaging in this new DMTF effort."
No specific standards currently exist for enabling interoperability between private clouds within enterprises and hosted or public cloud providers. DMTF’s Open Cloud Standards Incubator will focus on addressing these issues by developing cloud resource management protocols, packaging formats and security mechanisms to facilitate interoperability.
“The new DMTF Open Cloud Standards Incubator is a step forward in enabling interoperability and portability between compute clouds,” said Dr. Stephen Herrod, CTO of VMware. "The ultimate goal is to provide customers choice as to where they can most efficiently and safely run their applications. This may be in an internal cloud within their own datacenter or in clouds managed by external providers. With our anticipated launch of VMware vSphere 4, we take a step in this direction with full support of the Open Virtualization Format (OVF). Furthermore, we are committed to working with our partners in developing and supporting the other critical standards that will enable this open cloud computing vision."
What you will notice is that there are no smaller startups, nor "cloud specific" companies such as Google or Amazon, included. At first glance this looks like a list of the old boys of tech. I can't help but wonder whether this will help or hinder adoption. An open development process is critical to the success of any standards this group may create. The last thing we want is to be forced to adopt a set of technical standards because our partners are telling us we need to. I'd rather see broad adoption from smaller and bigger players alike as the driving factor when implementing a new set of technology standards.
My questions to the DMTF
How can we as a community help with this initiative? Are all contributors required to join the DMTF? And I didn't see anywhere in the press release what the time frame is for developing these standards.
For more info, see: http://dmtf.org/cloud
Thursday, April 23, 2009
Google Trends - Cloud Computing Surpasses Virtualization in Popularity
For those unfamiliar with Google Trends, it's a service that analyzes a portion of Google web searches to compute how many searches have been done for the terms you enter, relative to the total number of searches done on Google over time. The results are shown as a Search Volume Index graph. Below is a comparison of cloud computing versus virtualization.
When looking at the graph it's important to look at the growth curve. A great analogy for this type of growth is found in the investment world. When investors are looking for growth opportunities, it is common to look for what is described as a hockey stick moment. The "hockey stick" describes a highly desirable pattern in a company's sales growth (or in our case search popularity). Initially, sales start off at a low level and grow slowly over time, sketching in the blade of the hockey stick. Then, if all goes well, at some point sales start to increase more rapidly, creating the upward curve that is the stick's neck. And then, if the company is really onto something, sales take off and growth becomes almost vertical. That's the handle of the stick.
Looking at cloud computing from the point of view of search trends, we would appear to be at that hockey stick moment, and based on the volume of interest I'm seeing for the Enomaly ECP, as well as new CloudCamps around the globe, the market for cloud computing is getting hot, very hot, and the graph above does a great job of illustrating the opportunity.
Wednesday, April 22, 2009
Security Guidance for Critical Areas of Cloud Computing
According to Cloud Security Alliance co-founders Nils Puhlmann and Jim Reavis, the several months of collaboration on the white paper were worth the effort: "We would like to thank the many contributors to this initial effort. The great diversity of services offered via cloud computing requires careful analysis to understand the risks and mitigation appropriate in each case. At the same time, we see enormous potential for the cloud model to eventually simplify many difficult security problems. This initial deliverable is just the beginning of our efforts, and we would like to extend an open invitation to industry experts to help us create additional best practices for practitioners and the industry."
The Cloud Security Alliance is a not-for-profit organization with a mission to promote the use of best practices for providing security assurance within Cloud Computing, and to provide education on the uses of Cloud Computing to help secure all other forms of computing. The founding thought leaders behind the formation of the Cloud Security Alliance are leading security practitioners from a range of private and public organizations and leading security companies.
Tuesday, April 21, 2009
Introducing The Virtual Machine Trojan
So what is a Virtual Machine Trojan? According to Castro, a virtual machine trojan is a seemingly benign virtual machine, downloaded from the Internet, that contains a trojan. The objective of the trojan is to remotely take control of the machine for nefarious purposes: stealing information, sending spam, conducting click fraud, staging denial of service attacks within a botnet, etc.
ViMtruder is written in Python and consists of a client, which is installed within a virtual machine, and a control server, which sits on a host on the Internet. The virtual machine, running Linux, is configured to automatically run the VMT client in the background on boot. The VMT periodically tries to contact the control server over the Internet using port 80 outbound. Once the control server links up with the VMT, you can send it Nmap commands to scan the target LAN where the VMT is connected.
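To illustrate the architecture (without reproducing the actual tool), here is a minimal, deliberately harmless sketch of that phone-home loop; the control URL is hypothetical, and where ViMtruder would hand the command to Nmap, this sketch only logs it.

```python
import time
import urllib.request

CONTROL_SERVER = "http://control.example.com/poll"  # hypothetical control endpoint
POLL_INTERVAL = 60  # seconds between outbound polls on port 80

def poll_forever():
    """Phone home over ordinary outbound HTTP -- the traffic most firewalls
    allow, which is exactly what makes the technique effective."""
    while True:
        try:
            with urllib.request.urlopen(CONTROL_SERVER, timeout=10) as resp:
                command = resp.read().decode().strip()
            if command:
                # ViMtruder would pass this to Nmap; this sketch only logs it.
                print(f"control server says: {command!r}")
        except OSError:
            pass  # control server unreachable; retry on the next interval
        time.sleep(POLL_INTERVAL)
```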
The types of attacks a VMT can execute are different than a normal trojan. The VMT does not have access to the host machine; rather, it has access to the local network. Therefore, a VMT can be programmed to do the following:
1) Sniff traffic in the local network
2) Actively scan the local network to detect machines, ports and services
3) Do a vulnerability scan to detect exploitable machines in the local network
4) Execute exploits in the local network
5) Launch brute force attacks against services such as FTP and SSH
6) Launch DoS attacks within the local network, or against external hosts
7) And of course, send spam and conduct click fraud
My first thought: imagine something like this embedded in an EC2 AMI, and the potential damage it could cause.
Monday, April 20, 2009
Buzzword Alert: Cloud Service Enabler Versus The Private Cloud
Whether it's EC2 or Rackspace, Savvis or GoGrid, at some point there needs to be a single software component, or series of components, that enables the creation of the so-called "cloud". Recently there seems to be a growing convergence within the data center, combining the best aspects of cloud computing, including SOA, utility computing, SLAs, automation and virtualization, into a single unified point of management (a kind of turnkey EC2). In a nutshell, this is the opportunity for infrastructure cloud service enablers, also known as "private clouds", such as Enomaly.
So what is a Cloud Service Enabler? The fundamental software building blocks required to create a cloud computing infrastructure.
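As a rough illustration of one such building block, here is a minimal sketch using the libvirt Python bindings to turn a (deliberately trimmed-down) machine definition into a running guest; a real enabler layers APIs, metering, quotas and scaling on top of primitives like this.

```python
import libvirt  # hypervisor abstraction layer; a common enabler foundation

# A deliberately minimal guest definition (a real one needs disks, NICs, etc.)
DOMAIN_XML = """<domain type='kvm'>
  <name>guest-001</name>
  <memory unit='MiB'>512</memory>
  <vcpu>1</vcpu>
  <os><type arch='x86_64'>hvm</type></os>
</domain>"""

def provision_vm():
    """The most basic enabler primitive: turn a machine definition into a
    running guest on the local hypervisor."""
    conn = libvirt.open("qemu:///system")
    try:
        return conn.createXML(DOMAIN_XML, 0)
    finally:
        conn.close()
```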
Sun @ Oracle > Is Hardware Dead?
The software aspects of the deal actually make a lot of sense. It certainly seems like a very obvious fit: Oracle is buying Java. Whether Larry Ellison admits it or not, Java is the key to enterprise focused cloud computing. It's the perfect language and format for fluid movement between existing enterprise data centers and Sun's enterprise focused cloud offering. Both Oracle and Sun have a strong foothold in the enterprise market, so the merger makes all the more sense purely from a sales point of view, sales being something Sun hasn't done very well at lately.
Other than making tonnes of money, what Oracle does best is manage a team of million-dollar-Ferrari-driving sales guys, in direct contrast to Sun, which is best at managing teams of highly innovative technologists. Together these two companies have a unique opportunity to drastically change how technology is sold, using a hybrid sales model made up of internal and external software components connected to a Java-centric cloud.
It would also seem that Sun's hardware assets have little or nothing to do with the deal. Oracle is a masterful software marketing company, and over the last 10 years Sun has also transitioned into a software company: from Solaris to Java, MySQL to the forthcoming Open Cloud Platform. Sun's greatest strength is its ability to turn innovative technology concepts into actual cutting edge software products and services. Oracle's value lies in its ability to successfully acquire software companies and eke out even higher profit margins than these companies could ever hope to on their own. I also think the deal means the end of the SPARC processor, and potentially the end of Sun hardware. Maybe not immediately, but it certainly seems like a lower margin business for the traditionally high margin Oracle.
Another interesting possibility arising out of the Oracle-Sun merger is further M&A activity, particularly from IBM, which must now compete with a newly empowered Oracle. It should be interesting to see who IBM buys in the next couple of months. My vote is for EMC.
(Disclosure: I am on the Sun Strategic Advisory Council. These comments do not represent the views of Sun Microsystems or Enomaly Inc.)
Thursday, April 16, 2009
The Linux Foundation is recommending KVM over Xen
At the event, Jim Zemlin, executive director of the Linux Foundation, said that "moving forward, the Linux Foundation is encouraging vendors and developers to standardize on KVM, not Xen."
Not that it makes much difference in the short term, but in the long term this may mean that KVM, which is included directly in the Linux kernel, may very well become the de facto choice for Linux-centric virtualization. What makes this even more important is that lately Linux and virtualization seem to be at the heart of most major cloud computing initiatives. It's going to be very interesting to see how this plays out.
McKinsey & Co - Clearing the Air on Cloud Computing
The McKinsey & Co. report titled Clearing the Air on Cloud Computing was recently made public and has been causing quite the uproar. Forbes, The Wall Street Journal, The New York Times and others have all written pieces quoting the report's claim that "clouds are NOT very cost-effective."
According to the Forbes article, McKinsey & Co. conclude that outsourcing a typical corporate data center to a cloud service would more than double the cost. The study uses Amazon.com’s web service pricing as the cost of outsourced cloud computing, since its service is the best known and it publishes its prices. On that basis, according to McKinsey, the total cost of the data center functions would be $366 a month per unit of computing output, compared with $150 a month for the conventional data center.
There are a few problems with the main thesis of the report. First, the report assumes a fairly static, constant data center environment running 24/7, but forgets to mention the concept of elasticity. It also completely neglects the new reserved instance option for EC2, which greatly reduces the cost for more predictable usage patterns.
Here is my math for the small reserved EC2 instance:
At $0.03 an hour, a small reserved EC2 instance will cost you about $262 a year for the uptime, plus $325 for the reservation, or about $48 a month. Compare that to about $876 a year, or $73 a month, for an on-demand instance (not including storage and bandwidth). This new pricing model brings EC2 in line with most VPS-style hosting providers.
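Or, as a quick sanity check of that arithmetic in Python:

```python
HOURS_PER_YEAR = 24 * 365  # 8760

def monthly_cost(hourly_rate, upfront_fee=0.0):
    """Effective monthly cost of a single always-on instance over one year."""
    return (hourly_rate * HOURS_PER_YEAR + upfront_fee) / 12

print("reserved:  $%.2f/month" % monthly_cost(0.03, upfront_fee=325))  # ~$48.98
print("on-demand: $%.2f/month" % monthly_cost(0.10))                   # ~$73.00
```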
Reading the report, it is obvious that its purpose is to debunk the hype around cloud computing. The report is generally well written and for the most part quite well thought out. In terms of providing a counterweight to the generally optimistic outlook from other market research firms, they have done a pretty good job of outlining the cons.
Worth a read.
Wednesday, April 15, 2009
Open Cloud Manifesto Reaches 150 Supporting Companies
21vianet, Abiquo Corp., Accario Inc., Advanced Millennium Technologies, AgileCLOUD, Akamai, Alert Logic, Altic, American Data Company, AMD, Aptana, Appistry, AppZero, Arista Networks, ASPgems, AT&T Corp., attribo, Avail Intelligence, Averiware, BakBone Software, Barcelona Supercomputing Center, Bit Estudio S.L., BLADE Network Technologies, Blend digital, BlueLock, Bluescreen Network, Boomi, Business Logic, Cargojet, S.A., Cast Iron, Cellopoint, Cisco, Clarified Networks, CloudVu, Cloudsoft Corporation, CORAID, Crescendo Networks, CSC, Dash Photos, DatR, Dilgenter, The Eclipse Foundation, Elastra, EMC, Engine Yard, Engineering Ingegneria Informatica, eNovance, Enomaly, enStratus, eyeOS, F5, FirstMile.US, Fireworks Software, Fotón Sistemas Inteligentes, Fulcrum Microsystems, Fusis Group, Getronics, Global Media Solutions, GoGrid, GRIDS Lab, University of Melbourne, GrokThis.net, Hariera, Inc., Heroku, Hitachi Data Systems, Hyperic, IBM, IGT, In Cloud Company S.L., inContact Inc., ISRAGRID, Juniper Networks, LanCord Systems, Inc., Litebi, LongJump, Luminis Innovation Consulting, Manjrasoft, ManualsMania, Metadot, metoCube, MFX, MyCube9.com, National Bank of Greece, Network Gulf Information Technology, NewServers, newScale, Inc., Northwest A&F University, North Carolina State University, Nirvanix, Novell, Object Management Group (OMG), Okuri Ventures, Onward Technologies, Opatan, Openliven, Open Cloud Consortium (OCC), Open-Xchange, OpSource, OXIA, Pentaho, Platinum Solutions, Prgmr.com, Process Maverick, PointStar, QuadraForte LLC, Quilitz&Quilitz Ltd, Quillcards, Rackspace, Rails Machine, RB Consulting, Red Rook, Red Hat, Rhinofly, RightScale, rPath, Saasmania, S3 Graphics, Sambam Internetwerken, SAP, Sevaa Group, Inc., Schaurer & Störger, Site5, STKI.INFO, SOASTA, Software AG, Sogeti, Solaiemes, Sun Microsystems, SWI Consultants, Synapses, Systems Ability Limited, TCOM, Technical University of Catalonia, Telefónica, Tech4Quant, Terremark Worldwide, The Hot Air Channel, The Open Group, The Reservoir Project, The Wikibon Project, Tortus Technologies, Trend Micro, Troxo, TypeHost Web Development, umc.global GmbH & Co KG, VALUE5, Velneo S.A., Venue Software, Veredas, Vivat Consulting, VoIP Logic, Voltage Security, VMware, Voxeo, VPS.NET, W2P Limited, WaveMaker Software, Welfore GmbH, Zenoss
API’s Are The New Marketing Platform
In the post, Kipp Bodnar, a social media marketing manager, says:
"The future of marketing is about companies developing useful applications for their customers that extend web services that the customers are already using. This replaces the current model which is to use web applications to communication with customers. The problem with current social media marketing is the noise. A company is one of thousands, sometimes millions of users and it is easy to get lost. Developing applications via API’s provide a way for companies to break out of the crowd and at the same time create value for customers. Brands will need to become conduits that facilitate consumer communications instead or interrupters that intermittently drop in advertisements."
This is a very interesting concept which may well fit into the context of interoperability as a marketing tool. Lately it seems that a lot of the cloud providers are using their API as the basis for marketing their cloud offerings such as Google's recent attempt to bridge into your data center or even Microsoft's Software + Services strategy. Strange as it may sound, it's starting to look like the battle for the cloud may be won or lost on the API layer.
Read more at http://digitalcapitalism.com/2009/04/api-marketing/
Elasticvapor is 1 year old!
Here is an overview of my RSS readership via FeedBurner. These stats don't include the nearly 200,000 people who have read the site directly through a browser.
Since April 2008.
- 164,927 views of 472 items
- 18,086 clicks back to the site on 472 items
Top 30 Posts by Popularity (RSS Readers)
Tuesday, April 14, 2009
Virtual R&D: Accelerating VM live migration
According to Lawton, "The time within which workloads can ebb and flow (and spike) is much quicker than the response time available to the scheduler to re-schedule VMs on other servers. As a result, the scheduler has to be ultra-conservative, otherwise it may break SLAs and/or create troublesome load-based hot-spots. By contrast, if the scheduler could expect near instantaneous VM migrations, it could perform much higher fidelity load-balancing or much more efficient power management (packing VMs onto absolutely the fewest number of powered-on servers). Thus, as live VM migration times decrease, the less conservatism is needed, and the greater the amount of potential performance and power savings can be wrung out of existing resources."
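A back-of-the-envelope sketch (with invented numbers) shows why faster migrations buy tighter packing: the scheduler's safety margin must cover whatever load growth can happen while a migration is still in flight.

```python
def safe_load_threshold(capacity, growth_rate, migration_seconds):
    """How full a host can get before the scheduler must start migrating
    VMs away: the headroom has to absorb the load growth that can occur
    while a live migration is still in flight."""
    return capacity - growth_rate * migration_seconds

# Illustrative numbers: load can spike by 0.5% of capacity per second.
print(safe_load_threshold(1.0, 0.005, 60))  # 60s migrations: act at 70% load
print(safe_load_threshold(1.0, 0.005, 15))  # 4x faster: act at 92.5% load
```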
Very interesting and worth a read > http://www.virtualization.info/2009/04/r-accelerating-vms-live-migration-by-4.html
An Open Virtual Machine Format for Cloud Portability
According to the DMTF, OVF simplifies interoperability, security and virtual machine lifecycle management by describing an open, secure, portable, efficient and extensible format for the packaging and distribution of one or more virtual appliances and applications. This enables software developers to ship pre-configured, ready-to-deploy solutions, allowing end-users to distribute applications into their environments with minimal effort. They also go on to state that the standard can also serve as a building block for cloud computing. From a cloud interoperability marketing point of view, it sounds almost too good to be true. Luckily, it is as good as it sounds and generally is very well thought through.
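For a sense of how consumable the format is, here is a minimal sketch that reads an OVF envelope with nothing but Python's standard library and lists the virtual systems it packages; the file name is a placeholder.

```python
import xml.etree.ElementTree as ET

OVF = "{http://schemas.dmtf.org/ovf/envelope/1}"  # OVF 1.0 envelope namespace

def list_virtual_systems(path):
    """Walk an OVF envelope and report the virtual systems it packages --
    the metadata a provider would read when importing an appliance."""
    envelope = ET.parse(path).getroot()
    for system in envelope.iter(OVF + "VirtualSystem"):
        print(system.get(OVF + "id"), "-", system.findtext(OVF + "Info", ""))

list_virtual_systems("appliance.ovf")  # path to an unpacked OVF descriptor
```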
But a major problem still remains -- not one infrastructure-as-a-service provider currently supports the OVF standard. Which got me thinking. It's been almost two years since the original OVF specification was submitted to the DMTF (September 2007), so why hasn't the apparently best (and only) option for VM-centric cloud interoperability and portability been adopted? It certainly doesn't appear to be a technical issue; maybe it's a business issue? If so, what might be driving it? From a customer point of view, being able to package and move a uniform, standardized OVF package would certainly seem compelling enough.
If you look at the hundred-plus companies listed on the open cloud manifesto website, there certainly seems to be a willingness to be associated with the concepts of "openness" and interoperability. The one thing I will freely admit is that the open cloud manifesto has done a tremendous job of providing us with a great deal of market intelligence on cloud interoperability and portability. From what I can tell, most generally agree that interoperability is an important issue that needs to be addressed. But the problem remains that talk is cheap and actions speak louder than words, and in our case action means adoption.
In the couple of years since the OVF standard was announced, a number of the largest technology companies have stated that they plan on supporting it. A few have taken steps to prove their commitment, including the IBM-sponsored open source project Open-OVF, a VMware OVF Tool, as well as Citrix's Project Kensho OVF Tool. So from an enablement point of view there certainly are tools to help in the adoption cycle. Yet OVF still isn't being adopted. Why?
So my final question is simple. Assuming OVF is the right format for open cloud portability, how can we as a community encourage cloud providers to start offering OVF support within their clouds?
Announcing the Enomaly Cloud Hosting Provider Program
Exciting news from Enomaly!
We will be releasing a version of our ECP technology specifically for carriers, xSPs and hosting providers looking to offer elastic cloud computing services -- the Enomaly Elastic Computing Platform, Hosting Provider Edition -- within the next few weeks.
This will extend our core ECP platform, already used by 15,000 organizations around the world, with the key capabilities needed by xSPs, carriers, and hosting providers who want to offer an IaaS service (think of Amazon EC2 + auto scaling + management GUI + metering + quota in one box) to their customers, including not only a REST API and a simple, easy-to-use customer self-service interface, but flexible integration with a provider's existing billing, provisioning, and monitoring systems.
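For a flavor of what a customer self-service call might look like, here is an illustrative sketch; the routes, host and credentials are invented for the example rather than taken from the actual ECP API.

```python
import requests

ECP_API = "https://cloud.example-provider.com/api"  # hypothetical endpoint
AUTH = ("customer", "secret")

# Illustrative only: these routes are invented for the sketch, not drawn
# from Enomaly's API documentation.
def provision_machine(image_id):
    """The shape of a provider-side self-service call: authenticated,
    metered against a quota, returning a handle to the new machine."""
    response = requests.post(f"{ECP_API}/machines", auth=AUTH,
                             json={"image": image_id})
    response.raise_for_status()
    return response.json()
```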
Shortly after this initial release, we will also deliver a set of unique security capabilities that will allow providers to strongly differentiate their offerings with enterprise-class Trusted Cloud services.
We are actively looking for an initial select group of global xSPs, carriers, and hosting providers to work closely with us as Charter Customers for the initial rollout of this new product edition. These Charter Customers will be rewarded with deep discounts on our platform, our highest level of support, and the ability to directly influence product direction and roadmap.
If you are a carrier, xSP, or hosting provider anywhere in the world, with between $1M and $1B of hosting-related revenue, and you'd be interested in exploring whether you may benefit from this program, please contact Dr. Richard Reiner at +1 416 848-6036 x105 or via our online registration form.
The Unified Data Center: Unified Computing Perspectives
For some, the biggest buzzword of 2009 so far is “cloud”; for Cisco it’s “unified”. Today Cisco announced a new server-centric strategy underpinned by a “unified computing” methodology. This unified approach to computing represents a radical shift in how we as an industry both visualize and manage a modern virtualized data center.
For Cisco, Unified Computing seems to be an overarching mantra applied to the broader management of data center resources (compute, storage, and network elements) through a single virtualized point of interaction. In a sense they are attempting the unification of the entire infrastructure stack, in what some are calling a unified infrastructure fabric.
As a new player in the server industry, Cisco has treated its Unified Computing System as an opportunity to completely reimagine the way we look at enterprise computing. Within this new vision for the data center is an integrated management platform that combines what they describe as a “wire once” unified fabric and an API built on a standards-compliant computing platform. Cisco’s platform has the potential to uniformly interact with all layers of the enterprise computing stack while reducing overall management costs. Given the macroeconomic factors of the current market, cost saving will certainly be an important area of differentiation. The stated goal is to cut total cost of ownership by 20 percent in capital expenses and up to 30 percent in operating expenses; in the short term this will be more important than any data center flexibility.
What’s been exciting for us at Enomaly is envisioning the potential for a single infrastructure abstraction that can encompass the entire infrastructure stack, as well as emerging cloud-centric technologies, through a unified application programming interface (API). At Enomaly we have been saying this for a while and fully believe that this model represents the future of computing. The capability to easily combine and assemble all the various pieces within an ever-changing virtualized infrastructure provides a tremendous opportunity to adapt legacy, single tenant environments to the new reality of a multi-vendor, cloud-centric world. In this new model Cisco seems to grasp that computing is no longer limited to a single data center or application provider; instead, companies need to navigate a hybrid computing environment that may include local and remote resources, trusted and untrusted, yet are still expected to maintain consistent security, availability and uptime.
This approach allows a series of virtual applications to act as a set of application building blocks that can be easily managed and adjusted through a consistent and unified computing fabric. In this new adaptive data center model, what is acting as an application server today may be a network switch or load balancer tomorrow. In a nutshell, this is the vision for unified computing, and it is the new reality facing all data center hardware and software vendors in the very near future.
Through my work with the Cloud Computing Interoperability Forum (CCIF) I know first hand the effort Cisco has put into creating an open platform that makes extensive use of existing industry standards. I believe that Cisco fully understands that to be successful in the current environment they must embrace open reference models and standardized APIs. This will be an integral aspect of ensuring a compatible, interoperable and portable application-centric infrastructure now and in the future.
What virtualization and cloud computing have done to the IT industry is open its eyes to the potential unification of the application and infrastructure stacks. These two traditionally separate components are now starting to morph into each other. In the near term it may become very difficult to see where your application ends and your infrastructure begins; unified computing will be at the heart of solving this problem.
Monday, April 13, 2009
The Case for a Cloud Computing Trade Association
In a recent report, market research firm Gartner outlined the tremendous opportunity for global cloud services, projecting revenues to increase 21 percent this year alone. According to Gartner, cloud-based offerings brought in $46.4 billion in 2008, a number projected to increase to $56.3 billion in 2009 and $150.1 billion by 2013. With this phenomenal growth and revenue expected in the cloud computing sector, a few in the cloud industry have begun to ask whether it is time to form a cloud computing industry trade association. As many of you know, I have been pushing for the creation of such an organization for a while. I thought I'd briefly lay out some of the opportunities I see for a cloud computing trade association and how it might look.
To give some background, according to Wikipedia, "a trade association is an organization founded and funded by businesses that operate in a specific industry. An industry association participates in public relations activities such as advertising, education, political donations, lobbying and publishing, but its main focus is collaboration between companies, or standardization."
First of all, I'm not advocating for the creation of an organization focused on a particular ideology, such as software licensing models or source code development, but one focused on accelerating the adoption of cloud computing by encouraging broader industry collaboration. More simply, a formalized "legal" trade association that brings together both small and large companies (startup or enterprise), while also bridging the greater cloud community by including the customers and users who share a stake in the adoption and advancement of cloud-centric technologies.
I believe the association should focus on the commonalities we share -- accelerating the adoption of cloud computing through a consensus view of the general opportunities cloud based technology brings to customers. I'm not speaking about defining what cloud computing is so much as defining the problems it solves and opportunities it enables. The things we can actually agree on.
To accomplish this, there are a number of joint advocacy and marketing programs the association might engage in. These could include web-centric activities such as industry forums, social networks and online collaboration tools, as well as "in person" activities such as local user groups, unconferences and trade shows. The association may also be in a position to assist in the creation of reference architectures, use cases, and white papers that help illustrate the "how" and "why".
Another opportunity is the active marketing and advertising of "cloud computing" through a uniform and consistent brand across multiple mediums, both online and off. A similar success story is the Wi-Fi Alliance. This group of industry leaders came together in 1999 to form a global, non-profit organization with the goal of driving the adoption of a single worldwide-accepted standard for high-speed wireless local area networking. Unlike Wi-Fi, cloud computing isn't a single technology or standard; it's a broad collection of many loosely coupled technologies, companies, standards and communities. Therefore we must rethink how an industry trade association works and is formed; I've been loosely referring to this as a Trade Association 2.0.
One of the concerns I've heard repeatedly over the last couple of weeks is the potential barrier to entry for participation in this type of association. The last thing this association should be is an exclusive club for a few select technology vendors and insiders. It needs to be available to all, and should foster engagement with the existing community while also providing a formal / legal umbrella that the larger companies will feel comfortable participating in. I am also cognizant that it takes money to make money, so there needs to be a middle ground, with some of the larger vendors potentially subsidizing the involvement of the smaller players and independent members. Simply put, membership should not be cost prohibitive.
The question of standardization also keeps recurring, and is probably the most debated topic when discussing the creation of a cloud trade association. It is my opinion that the last thing the world needs is yet another standards body. There are dozens of existing groups and organizations that would be ideal partners; I say let's work with them. I would guess a good portion of the members of such a trade association would already have memberships in existing standards bodies. There is no need to reinvent the wheel or boil the ocean. I'd rather see this association partner with the standards world than compete with it.
When it comes to structure and governance, I freely admit I am no expert. Generally, from the conversations I've had, it seems that the best approach is one of "meritocracy": literally, government by merit. The best example of a meritocracy in action is the Apache Software Foundation, a 501(c)(3) non-profit organization incorporated to support and protect the various open source Apache projects.
According to the Apache Foundation website, "When the group felt that the person had "earned" the merit to be part of the development community, they granted direct access to the code repository, thus increasing the group and increasing the ability of the group to develop the program, and to maintain and develop it more effectively. What is interesting to note is that the process scaled very well without creating friction, because unlike in other situations where power is a scarce and conservative resource, in the Apache group newcomers were seen as volunteers that wanted to help, rather than people that wanted to steal a position."
The question of incorporation (bylaws, strategic plans, tax, etc.) is also important to address. Although it is fairly easy to incorporate a non-profit, the difficulty is in actually running it; we also have day jobs. Luckily there are a number of organizations in place to help with the back office aspects of creating and operating a trade association, including the IEEE-ISTO, the Open Group, and even dedicated association management companies such as Association Headquarters, whose only task is to help manage the day-to-day operations of a trade association. My suggestion is that we simply outsource the mundane operational aspects to a capable partner.
Why now? The fact is we're no longer talking about a hypothetical industry. With almost $50 billion in real revenue last year, we as an industry have a huge opportunity to collaborate and capitalize on potentially one of the biggest technology shifts we've ever seen. Let's not waste it.
Gartner Names Enomaly Cool Vendor in Cloud Computing System and Application Infrastructure, 2009
To access the report, you’ll have to talk to Gartner first. It is available here.
-
*Cool Vendor Disclaimer
About Gartner's Cool Vendors Selection Process
Gartner's listing does not constitute an exhaustive list of vendors in any given technology area, but rather is designed to highlight interesting, new and innovative vendors, products and services. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
Gartner defines a cool vendor as a company that offers technologies or solutions that are: Innovative, enable users to do things they couldn't do before; Impactful, have, or will have, business impact (not just technology for the sake of technology); Intriguing, have caught Gartner's interest or curiosity in approximately the past six months.
Friday, April 10, 2009
Cloud providers vow interoperability
I found the following comments from Adam Selipsky of Amazon Web Services particularly interesting.
Amazon, which provides cloud-based storage and computing services, believes that allowing customers to do "whatever they want" is vital, said Adam Selipsky, vice president of product management and developer relations for Amazon Web Services.
"We are open and continue to be. Customer choice is our philosophy; we offer a la carte services," said Selipsky. He noted that customers can program in any language, and that its emphasis on delivering low-level infrastructure services, such as hosted environments, does not force customers to make choices that only apply to that environment.
Read the rest here
Thursday, April 9, 2009
CloudCamp Austin, April 25th
Location: Austin City Limits
2504 Whitis Ave
Austin, TX 78705
Tentative Schedule:
Sat April 25th (10am - 4pm) to be followed by a Happy Hour at Little Woodrows
Local Organizers:
- Michael Cote
- Sara Dornsife
- John Willis
- Matthew Wolken
- Phil Fritz
Wednesday, April 8, 2009
Google's Cloud Bridges your Data Center
I have been lucky enough to be given access to review some of the new App Engine features. For me, the inclusion of Java is the least exciting of them. The most exciting is the addition of hybrid cloud components that let you use a combination of cloud based resources and traditional data center resources.
These features include:
- Access to firewalled data: grant policy-controlled access to your data behind the firewall.
- Cron support: schedule tasks like report generation or DB clean-up at an interval of your choosing.
- Database import: move GBs of data easily into your App Engine app. Matching export capabilities are coming soon, hopefully within a month.
The Secure Data Connector (SDC) forms an encrypted connection between your data and Google Apps. SDC lets you control which users in your domain can access which resources using Google Apps.
SDC works with Google Apps to provide data connectivity and enable IT administrators to control the data and services that are accessible in Google Apps. With SDC, you can build private gadgets, spreadsheets, and applications that interact with your existing corporate systems.
An SDC connection works through the following steps:
- Google Apps forwards authorized data requests from users who are within the Google Apps domain to the Google tunnel protocol servers.
- The tunnel servers validate that a user is authorized to make the request to the specified resource. Google tunnel servers are connected by an encrypted tunnel to SDC, which runs within a company's internal network.
- The tunnel protocol allows SDC to connect to a Google tunnel server, authenticate, and encrypt the data that flows across the Internet.
- SDC uses resource rules to validate whether a user is authorized to make a request to the specified resource (see the sketch after this list).
- An optional intranet firewall can be used to provide extra network security.
- SDC performs a network request to the specified resource or services.
- The service verifies the signed requests and if the user is authorized, returns the data.
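To make the resource-rule validation step concrete, here is a small sketch of first-match rule checking; the rules and their shape are hypothetical (SDC's real configuration is an XML file and differs in detail).

```python
import fnmatch

# Hypothetical rules in the spirit of SDC's resource rules: each rule says
# which users may reach which internal URL patterns.
RESOURCE_RULES = [
    {"users": ["reports-team"], "url": "http://intranet/reports/*", "allow": True},
    {"users": ["*"],            "url": "http://intranet/*",         "allow": False},
]

def is_authorized(user, url):
    """First matching rule wins: the request only reaches the internal
    resource if a rule explicitly permits it."""
    for rule in RESOURCE_RULES:
        if (any(fnmatch.fnmatch(user, pattern) for pattern in rule["users"])
                and fnmatch.fnmatch(url, rule["url"])):
            return rule["allow"]
    return False

print(is_authorized("reports-team", "http://intranet/reports/q1"))  # True
print(is_authorized("sales-team", "http://intranet/reports/q1"))    # False
```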
Wednesday, April 1, 2009
The Greatest Platform Shift
Through all the excitement, something has become very clear: cloud computing isn't about a particular technology, standard, or API, but instead broadly represents a massive platform shift. A shift toward a computing future that isn't tied to a static computing environment, be it desktop, mobile or server. It represents a completely new way of looking at technology, one that the technology world has yet to fully grasp. What the major technology players do seem to understand is that it's going to be a very big change. Herein lies the problem.
The Open Cloud Manifesto has come to represent this uncertainty. It has less to do with the content of the document and much more to do with the uncertainty that underlies the technology industry as a whole. It represents a power struggle for a piece of an as-yet-unknown pie.
So what is cloud computing? Possibly the biggest platform shift in the history of technology.