Monday, October 26, 2009

What Comes After The Cloud?

Lately I feel like that '80s rock band that had one big hit, doomed to play the same song night after night. In my case I happened to stumble upon this thing called cloud computing a little earlier than most. Over the last 6 years or so I've watched the concept of outsourced, web-centric IT go from a fringe idea to an overly hyped, albeit under-adopted buzzword. I've watched just about anything with the word "cloud" attached to it take off.

When I speak at conferences my presentations have shifted from "what is" or "how does" to "where do we go from here?" It seems that somewhere along the way people started asking me to act as a kind of futurist or more specifically, to speculate about the future. And funny as this may sound, I actually quite enjoy this new role of prognosticator. So in keeping with the theme, I'm going to prognosticate a little bit on this October evening in the year 2009.

So what's next, you say? Some say the semantic web; to me this is obvious, just applying semantics to what is already underway. Others say ubiquitous computing; I say this is an extension of what has for many already become ubiquitous. Others say maybe green, maybe mobile, maybe global. Regardless of the specifics, technology is quickly becoming a central aspect of many established economies. In other parts of the world, having a mobile phone is on par with a basic civil right like water or food. To lack this, the most basic form of communication, is in a sense to not exist at all.

Yet there is a disconnect between the highly connected Western world and the highly socially interconnected emerging economies. In looking at the opportunities for technology in the coming decades and beyond, stop focusing on the specific features & functions within technology. Look instead to the opportunities for social change on an individual basis: the enablement of those who previously were not enabled.

I believe that the biggest opportunities will be in enabling those with the least. So my prediction is simple: Want to Get Rich? Sell to the Poor. Places like Brazil, China, India and Malaysia are in the midst of rapid and amazing transformations, brought about by access to technology that was previously unavailable. Access to information and affordable technology will be at the heart of this transformation. Feed this hunger and you will empower the people within these economies. Rather than focus on globalization, focus on regionalization. Focus on how technology will affect that single human being.

What is clear is the 20th century belonged to the West, and the 21st will belong to the rest.


Google Liberates Your Docs

In a note to the CCIF list, Sam Johnston informed us that Google has followed through on its promise to liberate our data as part of its Data Liberation Front project. This latest Google effort introduces a new feature that makes it much easier to get your content back out of the cloud, using a tool that lets Google Docs users easily "Convert, Zip and Download."

It's interesting to note that both Microsoft and Google released competing "open" initiatives today, with Microsoft announcing they are opening the PST format for Outlook. It's great to see both companies actively battling it out for "Open Cloud" supremacy.

In a blog post earlier today, Google outlined the process: "Select one or more files and then click on "Export" from the "More Actions" menu. Next, pick the format (e.g. PDF, Microsoft Word, etc) you want for your exported files. Finally, click "Continue" and we'll give you a nice zip file to download that has all your content."

According to the post, "For now, you can "export" up to 500 MB of content in a single zip file, which is over 20,000 typical files. Sometimes it takes us a few minutes to export really large amounts of files, so instead of making you wait, we added an "Email when ready" option. We'll send you a link when the zip file is ready."
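For those who'd rather script an export than click through the UI, the same flow can be sketched programmatically. This is a minimal illustration only: the endpoint URL and parameter names below are assumptions made for the example, not Google's documented API.

```python
from urllib.parse import urlencode

# Assumed export endpoint -- a placeholder for illustration,
# not a documented Google API.
EXPORT_ENDPOINT = "https://docs.google.com/feeds/download/documents/Export"

def build_export_url(doc_id: str, export_format: str = "pdf") -> str:
    """Construct a URL that would request one document in a given format."""
    query = urlencode({"docID": doc_id, "exportFormat": export_format})
    return f"{EXPORT_ENDPOINT}?{query}"

def export_requests(doc_ids, export_format="pdf"):
    """Mirror the 'select files, pick format, download' flow for many docs."""
    return [build_export_url(doc_id, export_format) for doc_id in doc_ids]
```

Each URL would then be fetched with an authenticated HTTP GET. The point is simply that "liberated" data is data you can pull out with a script, not just a button.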

Give the new features a whirl and let Google know what you think.

CloudCamp in the Cloud (Recap & Video)

Last Thursday we held our first ever CloudCamp in the Cloud, and so far the feedback has been very positive. A few people pointed out that we could have done a better job with the chat, or created an official backchannel of some sort, as well as a more organized series of breakouts. But all in all I'm very happy with the results. For anyone interested, I've created a CloudCamp Channel where I've posted the audio/video from the camp. Going forward we hope to do more of these, possibly on a weekly or biweekly basis. You can also grab the podcast from Dropbox.

See the video below. I have no idea what I'm doing with my hands at the beginning; I guess I was nervous or something. ;)

Microsoft "Opens" Outlook Personal Folders Format (.pst)

Big news from Microsoft today. In a post to the Interoperability @ Microsoft blog, Paul Lorimer, Group Manager of Microsoft Office Interoperability, announced that they will be "opening" the Outlook Personal Folders format, also known as the .pst file.

Lorimer said that "In order to facilitate interoperability and enable customers and vendors to access the data in .pst files on a variety of platforms, we will be releasing documentation for the .pst file format. This will allow developers to read, create, and interoperate with the data in .pst files in server and client scenarios using the programming language and platform of their choice. The technical documentation will detail how the data is stored, along with guidance for accessing that data from other software applications. It also will highlight the structure of the .pst file, provide details like how to navigate the folder hierarchy, and explain how to access the individual data objects and properties."

He also admitted that the documentation is still in its early stages and that work is ongoing, going on to say, "We are engaging directly with industry experts and interested customers to gather feedback on the quality of the technical documentation to ensure that it is clear and useful. When it is complete, it will be released under our Open Specification Promise, which will allow anyone to implement the .pst file format on any platform and in any tool, without concerns about patents, and without the need to contact Microsoft in any way."

This initiative is part of Microsoft's Interoperability Principles, which they announced in early 2008. As part of this initiative Microsoft has committed product features, documented formats, and implementation of standards that allow interoperability. The move to open up the portability of data in .pst files is another step in putting these principles in action.

Lorimer also said that "Over the past year, Microsoft Office has taken several steps toward increasing openness and documenting interoperability guidelines, offering customers a choice of file formats and embracing a comprehensive approach that includes transparency into our engineering methods, collaboration with industry stakeholders, and shared stewardship of industry standards."

This is a great move by Microsoft!

Wednesday, October 21, 2009

EuroCloud & The Case for a Cloud Computing Trade Association

As many of you know, I've been pushing for the creation of a formal Cloud Computing Trade Association for quite some time, but unfortunately we've lacked both the will of the industry and the money to make it happen. Given the recent announcement of the EuroCloud association in Europe, the opportunity to do something here in North America, or more broadly, has re-emerged. In my previous discussions there has been a tremendous amount of interest in the creation of an international Cloud Computing Trade Association, but when it comes down to it, the money to fund such an endeavor wasn't there.

So I would again like to pose the question: is now the right time to re-engage this conversation? Or am I beating a dead horse?

As pointed out by Fred Zappert on the CCIF list earlier today, more than thirty companies have joined forces to form a new European cloud computing association called EuroCloud. The purpose of this new group is to bring together European SaaS and cloud computing vendors, enablers, integrators and industry experts to share best practices and promote new business opportunities across the continent.

The creators of EuroCloud have done a great job of outlining the rationale behind why they created the trade org. Similar points could apply to an international cloud computing organization:

  • Europe has a fast growing SaaS and Cloud Computing industry, but each country is currently operating separately with few contacts in other European countries.

  • National SaaS vendors are growing and are looking to build European and international relationships through business and technological partnerships.

  • The European Authorities do not currently recognise the European Cloud Computing industry, which is an industry that can help stimulate the economic and technological environment to promote new Cloud Computing industries.

  • Cloud Computing implies application integration into an Application-Oriented Ecosystem. Developing new application partnerships, both European and worldwide, represents the next crucial step.

EuroCloud, the pan-European cloud computing business network, will help answer these new demands through action at both the local and European levels.

EuroCloud Goals
  • To build a pan European network organized in two tiers with a national level (France, Espagna, England, Belgique, etc.) and a European level. The national level focuses on local topics and the European level on European topics, under the EuroCloud brand (or another if appropriate in a national setting). Only companies who have an interest in Cloud Computing and participate in the Cloud ecosystem can be members of the network.

  • Build relationships with the European authorities (Commission and Parliament) to help recognise the Cloud Computing industry as the future of IT in Europe and to promote a stimulating environment for development and growth of the industry.

  • Promote business relationships between members throughout Europe and internationally with counterparts such as SIIA.

  • Promote technological relationships between members throughout Europe and internationally.

Redux: The Case for a Cloud Computing Trade Association (Originally posted April 13th 2009)

In a recent report, market research firm Gartner outlined the tremendous opportunity for global cloud services, projecting revenues to increase 21 percent this year alone. According to Gartner, cloud-based offerings made $46.4 billion in 2008, a number projected to increase to $56.3 billion in 2009 and $150.1 billion by 2013. With this phenomenal growth in revenue expected in the cloud computing sector, a few in the cloud industry have begun to ask whether it is time to form a cloud computing industry trade association. As many of you know, I have been pushing for the creation of such an organization for a while. I thought I'd briefly lay out some of the opportunities I see for the creation of a cloud computing trade association and how it might look.

Generally, the idea is the creation of a formal cloud computing trade association founded and funded by businesses that operate in our industry. The association would engage in various public relations activities such as advertising, education, lobbying, publishing and certification, but its main focus would be collaboration between companies.

First of all, I'm not advocating for the creation of an organization focused on a particular ideology, such as software licensing models or source code development, but instead one focused on accelerating the adoption of cloud computing by encouraging broader industry collaboration. More simply, a formalized "legal" trade association which brings together both small & large companies (startup or enterprise), while also bridging the greater cloud community by including customers & users who all share a stake in the adoption and advancement of cloud-centric technologies.

I believe the association should focus on the commonalities we share -- accelerating the adoption of cloud computing through a consensus view of the general opportunities cloud based technology brings to customers. I'm not speaking about defining what cloud computing is so much as defining the problems it solves and opportunities it enables. The things we can actually agree on.

To accomplish this, there are a number of joint advocacy and marketing programs the association could engage in. These could include web-centric activities such as industry forums, social networks and online collaboration tools, as well as "in person" activities such as local user groups, unconferences and trade shows. The association may also be in a position to assist in the creation of reference architectures, use cases, and white papers that help illustrate the "how" & "why".

One concern I've heard repeatedly is the potential barrier to entry for participation in this type of association. The last thing this association should be is an exclusive club for a few select technology vendors and insiders. It needs to be available to all, and should foster engagement with the existing community while also providing a formal / legal umbrella that the larger companies will feel comfortable participating in. I am also cognizant that it takes money to make money, so there needs to be a middle ground, with some of the larger vendors potentially subsidizing the involvement of the smaller players and independent members. Simply put, membership should not be cost-prohibitive.

The question of standardization also keeps recurring, and is probably the most debated topic when discussing the creation of a cloud trade association. It is my opinion that the last thing the world needs is yet another standards body. There are dozens of existing groups and organizations that would be ideal partners; I say let's work with them. I would guess a good portion of the members of such a trade association would already have memberships in existing standards bodies. There is no need to reinvent the wheel or boil the ocean. I'd rather see this association partner with the standards world than compete with it.

Why now? The fact is we're no longer talking about a hypothetical industry. With almost $50 billion in real revenue last year, we as an industry have a huge opportunity to collaborate and capitalize on potentially one of the biggest technology shifts we've ever seen. Let's not waste it.

Monday, October 19, 2009

Google's Data Liberation Front

In what can be seen as a major win for users of Google's various cloud services, the company has announced a new website called "The Data Liberation Front" dedicated to be the central location for information on how to move your data in and out of Google products.

According to the site, "The Data Liberation Front is an engineering team at Google whose singular goal is to make it easier for users to move their data in and out of Google products. We do this because we believe that you should be able to export any data that you create in (or import into) a product. We help and consult other engineering teams within Google on how to 'liberate' their products."

Their mission statement: "Users should be able to control the data they store in any of Google's products. Our team's goal is to make it easier for them to move data in and out."

The site's creators point out that the project was started as an internal engineering team back in 2007. When the team couldn't agree on a name, they came up with "Data Liberation Front" as an homage to the Judean People's Front, the splinter group in Monty Python's Life of Brian that spends most of its time bickering. The team also indicated that they see themselves as somewhat subversive, not so much within Google, but insofar as it's unusual for a big company to work to make it easier for its customers to leave.

The site also points out that there shouldn't be an additional charge to export your data. Beyond that, if it takes you many hours to get your data out, it's almost as bad as not being able to get it out at all. I would also add: it's just as bad if your data isn't usable once you have it. For example, a single 1 TB text file is (almost) as bad as not getting your data at all.
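To make the usability point concrete, here's a toy sketch. The record delimiter and sample data are invented for illustration; the idea is simply that an export is only useful if it can be split back into individually usable pieces.

```python
def split_dump(dump: str, delimiter: str = "\n---\n") -> list:
    """Split one monolithic export blob back into individual documents."""
    return [record.strip() for record in dump.split(delimiter) if record.strip()]

# An invented three-document export, delimited for the sake of the sketch.
blob = "doc one contents\n---\ndoc two contents\n---\ndoc three contents"
docs = split_dump(blob)
# Three separately usable documents instead of one opaque blob.
```

Without a documented structure, a giant dump is just bytes; with one, it's your data back.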

The FAQ answers some interesting questions, including that of data standards: "We're working to use existing open standards formats wherever possible, and to document how we use those formats in a clear simple manner."

Personally, I applaud this move by Google. Let's hope others in the cloud space follow Google's lead.

Ruv's Upcoming Cloud Travel Schedule

I am pretty busy over the next couple of months with various cloud computing related travel. Below is an overview of some of the key locations and events where I will be speaking. Please feel free to get in touch if you'd like to meet up in person or have me speak at your conference / summit.

CSIM - Fall IT Executive Forum
October 27th & 28th - Los Angeles, California

Keynote Presentation: The future of Cloud computing
(Guests are welcome)


2-4 NOVEMBER 2009, Park Plaza Victoria Hotel, London

Keynote Presentation: Managing the Cloud - The Impact of Cloud Computing for Data Management Professionals


CloudCamp Tokyo
November 17th (I'll be in Tokyo 16th-22nd of November)

AREA SHINAGAWA Bldg. 1-9-36 Konan, minato-ku,
Tokyo, 108-0075 Japan
(2 min walk from Shinagawa Station)


IGT2009 World Summit of Cloud Computing
December 2-3rd (Tel Aviv, Israel)

CloudCamp Israel (Dec 2nd)

- Tel Aviv, Israel @ IGT

Sunday, October 18, 2009

Is IaaS (as a term) Doomed?

Interesting post by Joshua Beil, the Director of Market Strategy and Research for Parallels. In his post, Beil asks a simple yet profound question: is the term "IaaS" doomed?

He says "It’s not that the concept of “infrastructure-as-a-service” is flawed… it’s the acronym that is doomed. Let’s face it, “SaaS and “PaaS” can be said out loud, but when you say “IaaS” in same way, well, it just doesn’t work. I’m reminded about something my mother once told me about what happens when I assume."

He goes on to outline three options.
  1. Migrate from IaaS to CaaS and make Verizon's day (CaaS is "Computing as a Service," which is what Verizon calls their IaaS offering)
  2. Consolidate IaaS and PaaS into just PaaS, as the distinction between these two is getting blurrier as offerings from Amazon and Microsoft’s Azure evolve.
  3. Replace IaaS with another term that’s not a “C”… but what? I spent some time looking at synonyms for the word “infrastructure” but just didn’t see anything that worked really well.
Actually, I quite agree with Beil's assertion. As far as acronyms go, IaaS is about as bad as they get. The fact that your infrastructure is provided "as a service" is obviously an important aspect, but in reality it's not the only, or even the most important, driver when looking at implementing a cloud-like infrastructure. APIs and other "web services" are quickly becoming pervasive; just about everything built recently is provided as a service or has some kind of web services available. And if by chance it doesn't, you're probably going to steer clear anyway. I'd say things like scalability, elasticity, federation, application efficiency, metering/chargeback, self-service access, open APIs and system automation are just as important, if not more so. So I ask, how important is it that your infrastructure is described "as a service"?

At Enomaly we've chosen to use the term "elastic computing" instead of either IaaS or cloud computing directly. But like most, from a marketing point of view we do refer to both terms within our various materials. As a fad or marketing position, the term cloud computing is about as popular as they come, with anything-as-a-service not far behind.

Embracing Low Performance Computing (LPC)

Had an interesting time in Banff last week at the Summit09 conference. The conference was a combination of events, including the Cybera/CANARIE Summit, OGF 27 and IEEE Grid 2009, which focused predominantly on the emerging opportunity for cloud computing at the intersection of the high performance, grid and distributed computing realms. The Summit brought together leading figures from around the globe, including organizations such as CERN as well as various academics.

A few of the more interesting tidbits included the discussions around the future of the Open Grid Forum. It seems that the OGF is currently going through a major transition, as the grid world is quickly distancing itself from the stigma surrounding the terms "grid" and "high performance computing." There were several conversations about whether the OGF should even continue calling itself the Open Grid Forum, with a few suggesting the Open Cloud Forum might be a more suitable name. Also notable: most of the marketing material at the Summit simply refers to "the OGF."

What I also found quite interesting was that there was little discussion of traditional high performance computing. Instead, what everyone seemed to want to know about was this thing called cloud computing and how it could benefit or improve their existing grid deployments. It was clear that the vast majority of attendees realize that cloud computing isn't just another way to describe grid or distributed computing, but instead an opportunity to reimagine how they could address the shift to globalized, web-centric computing.

For the most part this reimagining seemed to mean a movement from the traditional aspects of HPC to the notion of Low Performance Computing, or LPC. Several times I heard the analogy "I'd rather start something now that takes a few days to complete than wait two weeks for something that takes a few hours to complete." These comments hit at the heart of the opportunity. The new reality facing traditional grid-centric architecture is that of efficiency and adaptability. Old grid computing systems and platforms tend to focus on one or two specifically optimized operations, but do little else. With the introduction of virtualization and infrastructure-as-a-service platforms, there seems to be renewed excitement in the ability to quickly re-provision and rapidly implement self-service applications.
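The turnaround analogy is easy to quantify. Using made-up numbers (the queue waits and run times below are illustrative assumptions, not measurements), the "slower" elastic option can still deliver results sooner:

```python
def turnaround_hours(queue_wait_hours: float, run_hours: float) -> float:
    """Total time from job submission to results in hand."""
    return queue_wait_hours + run_hours

# Shared HPC cluster: a two-week queue, then a highly optimized four-hour run.
hpc = turnaround_hours(queue_wait_hours=14 * 24, run_hours=4)   # 340 hours

# Elastic "low performance" cloud: no queue, but the job runs for three days.
lpc = turnaround_hours(queue_wait_hours=0, run_hours=3 * 24)    # 72 hours

assert lpc < hpc  # the less optimized option still finishes first
```

The optimized run is faster in isolation; it's the queue that dominates total turnaround.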

Another comment I heard several times was that the less these grid folks needed to involve their system admins, the better. The idea of "humanless computing" kept coming up: the more tasks that can be automated, the less chance for human errors to be introduced into the various workflows. Simply put, humans seem to be the biggest obstacle facing grid-related technologies.

All in all a very interesting week in Banff.

Saturday, October 17, 2009

Anatomy of a Cloud Consultant

Earlier this week I was asked to participate in a cloud panel with a group of so-called cloud experts, focused on the state of the cloud industry. I have been on many of these cloud panels in the last year, and I have found that what defines a "cloud expert" is pretty vague.

So what is a cloud expert/consultant? First, let's go to Wikipedia. According to the site, in the broadest sense, "a consultant is a professional who provides advice in a particular area of expertise. A consultant is usually an expert or a professional in a specific field and has a wide knowledge of the subject matter. A consultant usually works for a consultancy firm or is self-employed, and engages with multiple and changing clients. Thus, clients have access to deeper levels of expertise than would be feasible for them to retain in-house, and may purchase only as much service from the outside consultant as desired."

So a cloud consultant is basically an "expert" in the realm of cloud computing: someone who has a deep and broad level of experience with, and understanding of, the problems introduced by moving to a cloud-based environment. This sounds straightforward enough.

So how do you qualify a cloud expert? This is where things start to get complicated. First of all, unlike other areas of IT, there is no professional certification for "cloud consultants." So choosing a professional cloud consultant or services firm is a matter of doing your due diligence. To help, I've compiled a brief checklist of things you may want to look for when selecting your cloud consultant.

1. Experience - As in any profession, experience solving real-world problems is probably more important than anything else. Has your potential consultant done anything of consequence? What other companies has your consultant worked with, what major obstacles have they solved, and how? On the flip side, if they claim 10 years of experience as a cloud consultant, dig deeper: how does their previous experience relate to what has more recently been referred to as the cloud? Some possible answers may include experience in grid or distributed computing, building large multi-location data center architectures, load balancing schemes, web server clustering or other elastic methodologies.

John M Willis is a prime example, with extensive experience in related areas of expertise such as enterprise systems management. Using this related experience, Willis has been able to transfer skills built up over decades into a thriving cloud consulting operation.

I'd also keep in mind that cloud computing isn't something new, but instead the intersection of several existing technologies. Make sure your consultant has the right mix of experience in the areas that are of most concern to you and your business.

2. Code - Often consultants do little more than make recommendations that others must implement. This can be useful, but running code is often more useful. One of the best and easiest ways to find great cloud consultants is to look for those who have taken it upon themselves to create open source cloud-related projects. The Boto project by Mitch Garnaat is a perfect example. Garnaat is a longtime AWS consultant, a doer who is an active member of the AWS community discussion boards; he has proven his worth through his actions in the community and by producing a project that helps thousands around the globe. It also helps that he's been working with AWS since 2006.

3. Community Engagement - As I mentioned previously, community involvement is another great way to gauge experience. Places like the AWS discussion boards, or various other discussion groups, are ideal places to find those hidden gems. They also provide valuable insight into the capabilities of a given consultant in a public setting. Is your consultant a troll who picks fights, or a helpful member of the community? A quick Google search and you'll have your answer.

4. Blogs & Whitepapers - Blogs have also become a very useful way to gauge a cloud consultant's vision and capabilities. Although they may not shed much light on actual experience, they do provide a potential channel through which you could find a consultant.

Randy Bias, a well-regarded cloud consultant, provides what he describes as a StrengthsFinder Report to help potential customers in their selection. The report reviews the knowledge and skills he has acquired and can give a basic sense of a consultant's abilities. According to Bias, the report provides insight into the natural talents of the consultant, and can give true insight into the core reasons behind their successes and why you should select them.

5. Interview - Like any job, interview your consultant. Ask questions that gauge their qualifications. Start off by asking the ultimate trick question: "What is cloud computing?" Good answers avoid the specifics of the technology and instead focus on the opportunities. Bad answers are things like "Salesforce" or "virtualization" or "VMware."

Keep in mind that if you ask 100 people what cloud computing is, you'll probably get 200 answers. And if you are wondering how I would answer the question, here you go; this one is on the house: "Broadly, I see cloud computing as a new method to market, manage and deploy software and/or infrastructure using the web. Or more simply -- web-centric software and infrastructure."

You may also want to refer to specific definitions, using things like the Wikipedia definition or the NIST definition as your benchmark. If your consultant cites NIST or other well-regarded "cloud luminaries," that isn't necessarily a bad thing; just make sure you agree with them. For instance, "according to Larry Ellison" may be good if you're interviewing with an Oracle shop, but not so good for a Google App Engine gig.

6. References - You're only as good as your last job. So make sure to do your homework and ask the right questions. What did the consultant do, what problems did they solve, what technologies and platforms did they use, and why was it a cloud project?

In closing, I do believe that a major obstacle for cloud computing consultants is the lack of accreditation. One possible solution is to create an official professional cloud certification. One model could be similar to the IT Architect Certification Program provided by the Open Group. The Open Group program provides a framework for accrediting third parties to establish IT Architect certification programs affiliated with The Open Group. This framework is specifically intended to standardize the process and criteria for IT Architect professional certification, and to establish a foundation for the skills and experience necessary to achieve such a distinction. Basically, the Open Group has created a standard way to select someone with the level of knowledge required to perform the job of an IT Architect. Something similar could be applied to the job of a cloud consultant / architect.

Good Hunting.

Wednesday, October 14, 2009

Cloud Fail

My previous post about the Sidekick failure seems to have whipped up a bit of a frenzy around whether or not the Sidekick platform is an actual cloud service. On one side you have people saying it isn't a cloud because it's not redundant, or distributed, or API-accessible, or whatever. On the other you have the media saying hey, it's a web-based service, so it's a cloud.

Whether the Sidekick platform is or isn't "cloud computing" is totally secondary to the real issue. The Sidekick failure has beautifully illustrated a major potential problem facing the use of any remotely hosted web service, cloud or otherwise, and that is trust.

My issue with the Sidekick cloud debate isn't whether or not it's a failure of cloud computing. You can't blame a buzzword. Cloud computing isn't any single technology; it's a new way to market, manage, deploy and operate web-centric software and infrastructure. So I agree it isn't a failure of cloud computing so much as a failure to build an adequate DR strategy, among other things.

This failure demonstrates, in the simplest terms, a key problem facing cloud computing: you are trusting someone else to manage your data and infrastructure. But leading an argument by saying it isn't a cloud because clouds can't fail is ridiculous.

Sunday, October 11, 2009

Cloud Computing is Dangerous

About 25 years from now, I can imagine having a conversation with my kids that goes something like this.

"Son, back in my day we used to store all our data on a single computer." My son in turn says, "Dad, that's crazy, I have every song I've ever listened to and every movie I've ever watched on my brand new ibrain, anytime anywhere" and I say "Worse yet, we had to return to that computer in order to access those files, in the snow, without shoes on..." (You get the idea)

Although I'm partly kidding, for most people this is still how personal computing works. Ask anyone who's ever lost a hard drive and they will tell you that your data is your life, and for the most part your life is stored on a single computer. If you lose that computer, you lose, well, your data. (No dramatics, sorry.) This raises the question: wasn't the emergence of cloud computing supposed to help solve these types of problems? Isn't cloud computing supposed to be the answer to all our problems?

I'm here to tell you. Hell No! Cloud computing is Dangerous!

Helping bring this danger to the forefront was the announcement last week that a division of Microsoft, ironically called Danger Inc., had likely lost all the contacts, photos and other personal data for users of the T-Mobile Sidekick. Pretty bad, huh? What was worse, this cloud service was a mandatory requirement for using the Sidekick service. If you wanted to use a Sidekick, you had no choice but to use this sadly lacking excuse for a hosted data service.

Many of the cloud pundits out there will try to tell you that the Sidekick service isn't a cloud application. Let's call it what it is: a cloud app -- your data when using a Sidekick is hosted in someone else's data center. In the most basic terms, if I choose a device such as a mobile phone that requires me to use someone else's data centers for storing my personal data, I expect it to be, at the very least, backed up automatically, and preferably I should have the ability to back it up myself. It appears that neither was an option for T-Mobile Sidekick customers. This failure hits at the heart of why interoperability and data portability are so important. It comes down to this: bad things happen, and I should have the ability to take the data that is mine, easily, if I choose to do so.

Over the last few days I've read a number of articles claiming that this cloud failure means the end of cloud computing. Let me remind you that failures happen, and they happen all the time. There are whole groups at major manufacturers devoted to inducing failures on purpose. Whether it's on your desktop, in your data center or in the cloud, to fail is human. But to be prepared is noble.

The best and easiest way to be prepared for the inevitable failures is to rely on services that allow for portability. Make sure you have a clear exit strategy before you choose a cloud service provider, and avoid the ones that attempt to lock you in. At the end of the day it's up to you to make sure you don't get Sidekicked (in the face or otherwise).
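Since the point is having an exit strategy rather than any particular tool, here is a minimal sketch of what a "bring your own backup" job might look like. The `fetch_export` callable is a hypothetical stand-in for whatever portability hook your provider actually offers (an API export call, an IMAP pull, a CSV download); the JSON shape is illustrative only.

```python
# Minimal sketch of a local-backup job for remotely hosted data.
# fetch_export is a placeholder for a provider-specific export hook.
import json
import time
from pathlib import Path

def snapshot(fetch_export, backup_dir: str) -> Path:
    """Pull a full export of your hosted data and keep a timestamped local copy."""
    data = fetch_export()                        # provider-specific export call
    out = Path(backup_dir)
    out.mkdir(parents=True, exist_ok=True)       # ensure the backup folder exists
    dest = out / f"export-{time.strftime('%Y%m%d-%H%M%S')}.json"
    dest.write_text(json.dumps(data, indent=2))  # a local copy that you control
    return dest
```

Run it on a schedule and the day your provider gets Sidekicked, you still have yesterday's copy of your own data.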

Again, for those of you affected by the Danger Inc. failure, my deepest sympathies are with you; you deserve more. You deserve your life back, or at the very least your data.

Thursday, October 8, 2009

Canadian Government Unveils Cloud Computing Strategy & Whitepaper

This week I had the honor of organizing and hosting the first in a series of Global Government Cloud Computing Roundtables. This first event was held in Ottawa, Ontario, was coordinated in partnership with Jennifer Meacher of Canada's Department of Foreign Affairs and International Trade (DFAIT), and was held alongside the GTEC conference.

The purpose of this invitation-only meeting was to provide an international forum for leading government CIOs and CTOs to discuss the opportunities and challenges of implementing cloud computing solutions in the public sector. Representatives from the GSA's Office of Citizen Services and Communications, as well as a variety of senior officials from various Canadian government departments, were in attendance. Attendees were eager to share insights into the opportunities and challenges facing cloud computing, both in the Canadian Government and more broadly. Needless to say, it was a lively discussion.

Jirka Danek, the Canadian Government's CTO (Public Works), outlined a detailed strategy for cloud computing within the Canadian Government (full text posted below). For those unfamiliar with Public Works and Government Services Canada (PWGSC), it is similar to the General Services Administration in the United States, with a mandate to be a common service agency for the Government of Canada's various departments, agencies and boards.

During his presentation, Danek pointed to cloud services such as the web site in the USA as a possible model to follow within the Canadian government. He also indicated that the Obama administration has provided Canada with a strong role model in driving the adoption and use of cloud computing within government. He sees opportunities for developing new cloud-centric policies for government agencies that must still segregate data and processes before they can be more broadly adopted into the cloud. He also raised concerns about interoperability, portability and the lack of standards as some of the key issues hurting government cloud adoption.

For me, one of the more exciting parts of the day was when Danek unveiled a detailed strategy for cloud computing, which I have the honor of sharing publicly below. (Download available here)

Cloud Computing and the Canadian Environment


Monday, October 5, 2009

CloudCamp Announces “CloudCamp in the Cloud” – Virtual Unconference - Oct 22nd

CloudCamp, organizer of the community-based cloud computing unconference series, today announced that it's taking its popular event series virtual with the forthcoming "CloudCamp in the Cloud." CloudCamp in the Cloud, to be held Thursday, October 22, 2009, from 12 noon to 3 pm Eastern Standard Time, builds upon the original live CloudCamp format, providing a free and open place for the introduction and advancement of cloud computing. Using an online meeting format, attendees will exchange ideas, knowledge and information in a creative and supporting environment, advancing the current state of cloud computing and related technologies.

There are a number of opportunities to get involved with CloudCamp in the Cloud:

  • ATTEND – Attending CloudCamp in the Cloud is free, fun and informative. Register now at
  • PRESENT – CloudCamp in the Cloud encourages community presentations. If you have a cloud-related topic to discuss, visit the page to submit a proposal.
  • SPONSOR – CloudCamp depends on corporate sponsors who provide financial assistance and other valuable donations. Current CloudCamp in the Cloud sponsors include Citrix, Enomaly and Appistry. If you would like to sponsor CloudCamp in the Cloud, please contact Reuven Cohen.
  • ORGANIZE – CloudCamp is a non-profit, volunteer-driven organization. If you'd like to help facilitate CloudCamp in the Cloud, let us know about your interest by emailing [email protected].
  • SPREAD THE WORD – Help share the news about CloudCamp in the Cloud, by retweeting this announcement (hashtag: #cloudcamp), blogging about the event, and linking to the main information page at

Related Links
[1] [CloudCamp in the Cloud Registration]
[2] [CloudCamp on Twitter]
[3] [CloudCamp on Facebook]

  • Dave Nielsen, (415) 531-6674, dave -at- platformd -dot- com
  • Reuven Cohen, (212) 203 4734 x102, ruv -at- enomaly -dot- com
  • Sam Charrington, (415) 727-1850, sam -at- appistry -dot- com

About CloudCamp
CloudCamp was formed in 2008 in order to provide a common ground for the introduction and advancement of cloud computing. Through a series of local CloudCamp events, attendees can exchange ideas, knowledge and information in a creative and supporting environment, advancing the current state of cloud computing and related technologies. CloudCamp has served over 5,000 CloudCampers in more than 50 events all over the world, in cities like Amsterdam, Antwerp, Bangalore, Berlin, London, New York, San Francisco, Stockholm and Singapore.

Sunday, October 4, 2009

Cloud Peering for Service Providers

I've been doing some thinking about the opportunities for cloud providers to offer the seamless ability to utilize other "compatible" cloud service providers' capacity as a kind of "cloud overdraft" protection. Then it occurred to me: the concept already exists and is a core part of how the Internet already functions. It's called "peering," and I'm calling my little spin on the concept "Cloud Peering."

Wikipedia describes peering as "a voluntary interconnection of administratively separate Internet networks for the purpose of exchanging traffic between the customers of each network." Now replace "Internet networks" with public cloud service / hosting providers and you start to see the opportunity.

Generally, peering relationships involve two or more networks coming together to exchange traffic with each other freely, and for mutual benefit. In the case of cloud computing, instead of traditional user traffic, on-demand cloud capacity can be made available in bulk or by metered usage.

Some other Cloud Peering motivations could include:

  • Cloud service provider overdraft protection, aka cloud bursting (smaller hosting providers seamlessly overflowing to larger ones; Random Small Cloud Provider Inc. bursts to the AT&T cloud through a white-label agreement)
  • Increased redundancy (by reducing dependence on any one cloud provider)
  • Increased capacity for extremely large amounts of traffic (distributing traffic across many cloud providers)
  • Increased routing control over your traffic (sudden spikes from London? No problem, scale using UK cloud resources)
  • Improved perception of your network (being able to claim a "higher tier", mostly for marketing purposes, possibly QoS or SLA related)
  • Ease of requesting emergency aid (from friendly peers, when sh*t hits the fan)

Also, following the same model as traditional peering, Cloud Peering could fall into one of three categories:
  • Transit (or pay) - You pay money (or settlement) to another network for cloud access (or transit)
  • Peer (or swap) - Two networks exchange traffic between each other's customers freely, and for mutual benefit
  • Customer (or sell) - Another cloud pays you money to provide them with cloud access
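To make the overdraft idea concrete, here is a minimal sketch of how a provider might fill its own capacity first and burst the remainder to its peers. The provider names, capacities and the greedy placement order are all illustrative assumptions, not any real provider's API.

```python
# Sketch of "cloud overdraft" placement: fill the home provider,
# then overflow (burst) to peered providers in priority order.
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    capacity: int       # total instance slots offered to this peer
    allocated: int = 0  # slots currently in use

    def free(self) -> int:
        return self.capacity - self.allocated

def place_workload(instances: int, home: Provider, peers: list) -> dict:
    """Return {provider_name: instances_placed}, bursting to peers as needed."""
    placement = {}
    for p in [home] + list(peers):
        take = min(instances, p.free())
        if take > 0:
            p.allocated += take
            placement[p.name] = take
            instances -= take
        if instances == 0:
            break
    if instances:
        raise RuntimeError(f"{instances} instances could not be placed")
    return placement
```

For example, a workload of 15 instances against a home provider with 10 free slots and one peer lands 10 at home and 5 at the peer; billing (transit, swap or sell) would then be settled per the peering agreement.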
Again, just a random thought, with a little help from Wikipedia.

Friday, October 2, 2009

Confessions of a Google Wave Fanboy

I have a confession to make, I am addicted. I can't stop, I've tried but I fear I am hooked. I am hooked on Google Wave. Quite possibly the coolest damn real time application platform I have ever seen. Yes, I know what you're thinking. It's true.

Like many who were given early Google Wave sandbox accounts, I didn't see the purpose at first. The rather buggy, JavaScript-laden interface was slow and at times cumbersome, and worst of all it crashed my browser all the time. There was also a most practical reason: I didn't know anyone else using the platform, and it was an alpha that was changing on practically an hourly basis.

Then a few weeks ago something changed: suddenly I started to see public waves using the "with:public" functionality. All at once the value of the platform became completely apparent. The "with:public + keyword" syntax allows you both to search for public waves and to create real-time topic listings of wave discussions. For example, the search "with:public tag:cloud computing" creates a listing of any discussions on the topic of cloud computing. This in essence completely replaces the need for a traditional mailing list or Google Group. And best of all, it's in real time. You can also link directly to your public waves; everything in Wave has a URL.

The URLs control all aspects of the interface; here is an example of a full-screen wave URL for a "waverati" discussion I created. Basically it goes like this: interface_config:searchvar:waveserver:waveid,minimized:contact,minimized:search:by%253Ame,!w%252BQJdniN4WN.1

So what is Wave? Well, let's just say it's hard to describe exactly what Wave is, other than that it's a lot of things all at once.

So here is my initial stab.

Google Wave is a combination of Email, Twitter, IM, Wiki, iGoogle, Google Groups and the Facebook Apps Platform, mashed up in real time.

For those who are lucky enough to be included in the beta, please feel free to add me.

I've also included a few of my favorite Wave bots:

Polly the Pollster ([email protected])

Creates and distributes multiple choice poll questions.

RSSyBot ([email protected])

Adds an RSS feed to Wave.

TwitUsernames ([email protected])

Links @usernames to

Blog bot ([email protected])

Publishes waves to blog posts.

Swedish Chef ([email protected])

Bork! Bork! Bork

