Thursday, December 31, 2009
First, I will admit, I am among the group of cloud advocates who routinely claim that cloud computing is green. I say this without any proof or evidence to support the statement. I make the claim as part of my broader pitch for cloud computing, speaking as a sales and marketing guy rather than as an advocate. As an advocate, I'd like to have some empirical data to support my position. Believe me, I've searched and searched -- although there are piles of forecasts about the potential market for cloud computing, said to be in the billions, little exists to support the green / eco-friendly argument.
On the face of it, a major incentive to move to cloud computing is that it appears to be more environmentally friendly than traditional data center operational / deployment models. The general consensus is that reducing the number of hardware components and replacing them with remote cloud computing systems reduces the energy costs of running and cooling hardware, shrinks your carbon footprint, and conserves energy through higher data center consolidation / optimization. But a major problem remains: where is the proof?
The problem is that there is no uniform way to measure this supposed efficiency. None of the major cloud companies are providing utilization data, so it is not possible to know just how efficient cloud computing actually is -- other than that it sounds and feels more green.
Tuesday, December 29, 2009
5. Keep Giving Back. Anyone who knows me knows that I am a doer. From CloudCamps to Cloud Interop, from advisory boards to mentorships, I will continue to give back whenever and wherever I can. If there is an itch, I will do my best to scratch it.
4. Keep Pushing Myself. Lately, I've come to the realization that my only limit is the limitation of my imagination, and so I will continue to push the boundaries of my own limitations. Whether it's ideas or ambition, housework or being a dad, I will strive to be the very best man I can.
3. Make More Friends. For someone who works as much as I do, there seems to be a fine line -- scratch that, actually, no line -- between work life and social life. Although I enjoy going out for drinks with random people in random cities, I believe that having close friends, the kind you can count on, is important. I will work harder at being a better friend to my current ones and make some new friendships along the way. I will do this without being too creepy about it ;)
2. Make More Money. I'm not sure if you're supposed to include this type of goal, but unlike wishing on a star, which probably should be selfless, as an entrepreneur I think you must focus on the obvious reason you started your business in the first place: to make money. If Enomaly were a sports team, I would characterize 2009 as a rebuilding year, one in which we focused on developing both our product line and our network of partners and customers around the globe. In direct contrast, 2010 will be the year all this hard work pays off. I resolve to make as much money as possible in the most moral, ethical and environmentally friendly ways possible.
1. Spend More Time With My Family. Yes, this may seem obvious to some, but for someone who has spent the previous 12 months traveling across the globe, covering more than 100,000 miles, it's also the most important of all. Keeping in mind that my son was born January 6th, 2009 and I have another child on the way this June, I must do my best to find that work / life balance. And yes, it's easier said than done.
Tuesday, December 22, 2009
According to Wikipedia, "Economies of scope are conceptually similar to economies of scale. Whereas economies of scale primarily refer to efficiencies associated with supply-side changes, such as increasing or decreasing the scale of production, of a single product type, economies of scope refer to efficiencies primarily associated with demand-side changes, such as increasing or decreasing the scope of marketing and distribution, of different types of products. "
So my hypothesis is that beyond just the money these companies bring to bear, they also bring the power of a large and well-developed sales & marketing channel. In effect, they can bundle a variety of applications and services for a well-established network of customers and partners. Regardless of whether your product is better, if your brand, marketing and sales channels are strong enough you can sell anything. (Microsoft owns the desktop, Google owns the net; both have key advantages.) If one of your product lines falls out of fashion, the company will most likely be able to continue operating because of diversification. It's been one of Google's key advantages: they can try out a million product directions, because they have the customer visibility and money (thanks to search ads) to do so, in the hope that one of these random products will revolutionize both their business and, hopefully, the world.
This may also be why a product sold to Oracle, IBM, Google or Microsoft is often more successful than one left independent. Of course there are always exceptions to these rules. These very market leaders were themselves all new entrants at one point and had to endure in order to overcome the incumbent players. But as a general rule of thumb, incumbency certainly seems to confer a huge advantage. So the key to success may just be to outlast your competitors and continually try new things.
Just a random thought.
Thursday, December 17, 2009
In a sense they're subsidizing the infrastructure costs for the mobile application developers they work with. They are basically covering the costs associated with the more routine aspects of mobile app development, while also empowering a new and broader group of potential partners by providing a quick and easy way to develop applications for their environment. Another advantage is gaining a greater pool of potential network-specific applications and developers. Very smart.
For me this use of a private partner cloud represents a great example of the opportunity in offering free cloud-based IaaS services specifically for your partners, suppliers and best customers. The free-to-partners model may be the next logical step for cloud computing. Large technology companies like Microsoft, Intel, IBM, Oracle and others may start using these partner clouds as part of their channel and developer programs: if you develop for our platform, we'll give you everything you need to do so, free of charge. This could include everything from compute to storage, development to deployment.
Monday, December 14, 2009
The philosophy of JIT is simple: inventory is waste. The idea behind a JIT strategy is to improve a business's return on investment by reducing the carrying costs associated with underutilized assets, whether that's a toaster sitting in a warehouse or a hosting company's unused server. The data center business is in a lot of ways very similar: the more unused rack space, the less you are making. Cloud-centric data centers make this problem even worse; not only do you need excess data center space, you now need physical hardware in place, just in case demand spikes. For a lot of the larger players this means unutilized compute capacity that is making them nothing.
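To make the carrying-cost point concrete, here is a tiny back-of-the-envelope sketch. All numbers are invented for illustration; real data center economics are of course far messier.

```python
def monthly_carrying_cost(total_servers, utilized_servers, cost_per_server_month):
    """Cost of servers that are racked, powered and cooled but earning nothing."""
    idle_servers = total_servers - utilized_servers
    return idle_servers * cost_per_server_month

# A provider keeping 1,000 servers racked "just in case", with only 600
# in active use, at a hypothetical $200/month all-in cost per server:
cost = monthly_carrying_cost(1000, 600, 200)
print(cost)  # 80000 -- $80,000 a month of capacity making you nothing
```

That idle 40% is exactly the "inventory" a spot market tries to turn back into revenue.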
The folks at Amazon Web Services have come up with a very interesting approach to the problem of data center carrying costs by implementing a spot pricing scheme for unused EC2 instances. In case you're not familiar with the concept, Wikipedia describes the spot price of a commodity as the price that is quoted for immediate (spot) settlement (payment and delivery). In securities, the term cash price is more often used.
The new service is called Spot Instances and allows you to bid for one or more EC2 instances at the price you are willing to pay -- which must, more importantly, be at or above the minimum price AWS is willing to accept. The Spot Instance request consists of a number of parameters, including the maximum bid you are willing to pay per hour, the EC2 Region where you need the instances, the number and type of instances you want to run, and the AMI you want to launch if your bid is successful. It's a kind of cross between arbitrage, an auction and an on-demand web service. The concept of spot pricing does have some challenges. Depending on the item being traded, spot prices can indicate market expectations of future price movements in different ways. For a security or non-perishable commodity (e.g., gold, or compute capacity), the spot price reflects market expectations of future price movements: capacity will cost more in December, when there is more demand, than it will in July, so you buy at July prices and sell at December prices. In theory, the difference between spot and forward prices should be equal to the finance charges plus any earnings due to the holder of the security, according to the cost of carry model. In finance, exploiting gaps in this relationship is known as arbitrage -- the practice of taking advantage of a price differential between two or more markets by striking a combination of matching deals that capitalize on the imbalance, the profit being the difference between the market prices. It's complicated; let's just say this is how Enron made its money.
This new spot pricing approach may open the door to EC2 capacity squatters who buy up all the excess compute capacity and in turn sell it at a higher price -- one still lower than the going rate for a traditional EC2 instance. This would be a practical approach for the costly larger instance types.
The potential for misuse does seem to be something Amazon has already put some thought into, saying in a recent blog post: "As requests come in and unused capacity becomes available, we'll evaluate the open bids for each Region and compute a new Spot Price for each instance type. After that we'll terminate any Spot Instances with bids below the Spot Price, and launch instances for requests with bids higher than or at the new Spot Price. The instances will be billed at the then-current Spot Price regardless of the actual bid, which can mean a substantial potential cost savings versus the bid amount." To me this says they will have a preference toward higher bids.
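One way to picture the mechanics Amazon describes is as a uniform-price auction: rank the open bids, fill the available capacity from the top, and set the spot price at the lowest winning bid, which everyone then pays regardless of what they bid. The sketch below is my own simplified interpretation of that clearing process, not Amazon's actual algorithm.

```python
def clear_spot_market(bids, capacity):
    """Toy spot-market clearing: bids are max prices per hour,
    capacity is the number of instances available.
    Returns (spot_price, winning_bids)."""
    ranked = sorted(bids, reverse=True)   # highest bidders get capacity first
    winners = ranked[:capacity]
    if not winners:
        return None, []
    spot_price = winners[-1]              # lowest bid that still won capacity
    return spot_price, winners            # all winners pay spot_price, not their bid

bids = [0.12, 0.08, 0.05, 0.30, 0.10]
spot, winners = clear_spot_market(bids, capacity=3)
print(spot)     # 0.1 -- the 0.30 and 0.12 bidders also pay only 0.10
print(winners)  # [0.3, 0.12, 0.1]
```

Note how this matches the quoted behavior: bids below the clearing price (0.08 and 0.05 here) get no instances, and winners are billed at the spot price rather than their bid.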
The post goes on to outline: "From an architectural point of view, because EC2 will terminate instances whose bid price becomes lower than the Spot Price, you'll want to regularly checkpoint work in progress. (Meaning you may lose your EC2 instances if a better rate comes along.) Many types of work are suitable for this incremental, background processing model including web crawling, data analysis, and data transformation (e.g. media transcoding). It wouldn't make much sense to run a highly available application such as a web server or a database on a Spot Instance, though."
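The checkpointing pattern the post recommends boils down to: do work in small increments and persist progress after each one, so an interrupted instance can resume where it left off. A minimal sketch, with the file name and work function invented for illustration:

```python
import json
import os

CHECKPOINT = "progress.json"  # hypothetical checkpoint file (real jobs would use S3 etc.)

def load_checkpoint():
    """Return the index of the next unprocessed item (0 if starting fresh)."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["next_item"]
    return 0

def save_checkpoint(next_item):
    with open(CHECKPOINT, "w") as f:
        json.dump({"next_item": next_item}, f)

def process(item):
    # Stand-in for one unit of batch work (a page crawled, a clip transcoded...)
    return item * item

def run(total_items):
    start = load_checkpoint()             # resume wherever we left off
    for i in range(start, total_items):
        process(i)
        save_checkpoint(i + 1)            # if terminated now, we restart from here

run(100)
```

If the spot price rises above your bid mid-run, the next instance you launch simply picks up from the last checkpoint rather than starting over.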
So what does spot pricing mean for the IaaS world? For one, we may for the first time start to see compute capacity treated the same way traditional commodities are, with the emergence of an active secondary market for compute capacity. Very exciting times; let's just hope we don't see another Enron.
Check out the new EC2 spot instance service here >
Thursday, December 10, 2009
Before I do, as anyone who routinely reads my blog will understand, pretty much all I do is attempt to predict the future.
As an entrepreneur, that has always been a key part of my successes and failures. (That, and I also seem to be an eternal optimist.) Generally my view of the future is not shaped by selecting any particular point in time, but is instead formed from what I see from my ever-changing vantage point in the present.
Before I dive into my predictions, I first must give you my ideology. It is my belief that before you can predict the future, you must first understand the past. By understanding the past, you are able to visualize your ideal future and, more importantly, the way to get there. The future is not predetermined, but rather guided by the decisions we as a global collective make today. (Cheesy, but hey -- I'm predicting the future.)
Anytime Data - Real Time, Anytime and Anywhere
As we continue our long march into the world of cloud computing and Internet-centric applications in 2010, I believe that real-time information (data) will be the most important asset any business, large or small, can have. With the sudden influx of cloud resources, those who learn to tap into this wealth of data most efficiently will ultimately succeed.
The one thing that Moore's Law and software development have taught us over the last 30 years is that the more compute resources we have available, the more we use. I see this holding true, except now we're not limited to any single CPU or data center. The future of computing will be about the speed at which we can make decisions (data analysis), enabled by an endless supply of real-time information gathered by a worldwide network of both human and automated sources. The world has become one giant computer network, with the Internet the glue that holds it together.
As I've written previously, I believe that the biggest technological business opportunities are not found in the established Western countries, but in the new crop of upstart economies in regions such as Asia, South America and even Africa. The primary reason is that these emerging economies have large population bases and, more importantly, don't suffer from the legacy infrastructure that most Western economies do. These regions offer, in a very real sense, a greenfield opportunity. These fast-growing emerging economies can choose the latest and best technology solutions without regard for how they may affect legacy systems -- since there really aren't any. In 2010, as we emerge from the recession, I believe we'll start to see these regions quickly become the brightest, biggest and fastest-growing opportunities. Help equip these economies and you'll equip yourself for a profitable future.
Lately I've come to understand that, beyond just being a buzzword, cloud computing has come to represent for me the convergence of many technologies: a kind of technological evolution in which many existing IT systems, processes and applications have come together, brought about by the Internet as both an operational and a delivery model.
This may be obvious to some, but in 2010 I see cloud computing continuing to be developed and utilized in many different, more radical contexts. Things we never thought possible are now being made possible by the rapid advancements brought about by near-limitless access to compute resources. Any one individual in any one basement is now able to compete with the largest companies. What we're seeing for the first time is the barrier to large-scale, compute-intensive innovation falling away, opening it up to all. It won't be those with the most money who win, but those with the best ideas.
Monday, December 7, 2009
First, in regards to open cloud services, the concept basically goes like this: as we move away from the traditional client/server models of the past to the more web-centric / service-oriented opportunities of the future, we will see open source shift from being application-centric (source code) toward free open services and information. Cloud providers will essentially give away access in return for greater adoption of their platforms / services, increased customer acquisition, and the accelerated creation of data and information. Basically the same reasons companies open source their applications today, just applied in a cloud context.
His comments really did get me thinking, and reminded me of a potentially huge but generally overlooked opportunity for cloud computing. What I'm talking about is the "Community Cloud" or "Cloud Cooperative", which may be a potential avenue for enabling these types of free or shared cloud services.
For those of you unfamiliar with the Community cloud concept, NIST defines it as "a cloud infrastructure that is shared by several organizations and supports a specific community that has shared concerns (e.g., mission, security requirements, policy, and compliance considerations). It may be managed by the organizations or a third party and may exist on premise or off premise."
For me the concept of a community cloud represents an opportunity to create shared pools of compute resources that could be made freely available among a diverse but related group of contributors. The rationale could be regulatory (such as HIPAA compliance in the US), geographical or economic. The idea of a community cloud is a logical offshoot of the traditional co-operative (co-op), a concept that dates back as long as humans have been organizing for mutual benefit (expressed today in "profit-sharing" and "surplus sharing" arrangements). Historically, co-ops allocated jobs and resources among their members. The concept of a cloud co-op seems fairly well suited to the smaller cloud hosting & service providers who are now forced to compete with global multi-billion-dollar cloud competitors. In a sense, a cloud co-op provides the power to group many smaller independent cloud operators together, creating a much stronger organization than any single contributor could hope to accomplish on their own.
For example, a group of European cloud providers could all agree to pool their resources and seamlessly offer capacity to each other. In some ways this is already happening within the broader research realm, in the various grid & HPC organizations (e.g., CERN). The key difference is applying this model to for-profit businesses that need to compete against the larger players.
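The pooling idea above can be sketched in a few lines. This is purely illustrative: the class, units and capping rule are all my own invention, and a real co-op would also need metering, settlement and trust mechanisms between members.

```python
class CloudCoop:
    """Toy model of a cloud co-op: members contribute spare capacity
    to a shared pool and can draw on the others' contributions."""

    def __init__(self):
        self.pool = {}  # provider name -> contributed capacity (instances)

    def contribute(self, provider, capacity):
        self.pool[provider] = self.pool.get(provider, 0) + capacity

    def total_capacity(self):
        return sum(self.pool.values())

    def draw(self, provider, amount):
        """Borrow capacity, capped by what the *other* members put in."""
        available = self.total_capacity() - self.pool.get(provider, 0)
        return min(amount, available)

coop = CloudCoop()
coop.contribute("provider-a", 50)
coop.contribute("provider-b", 30)
coop.contribute("provider-c", 20)
print(coop.draw("provider-a", 60))  # 50 -- capped by the others' 30 + 20
```

Even in this crude form, the point holds: three providers that individually top out at 50 instances can jointly absorb a spike no one of them could handle alone.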
Back to the idea of open services: I agree we very well may be moving toward the free (as in beer) model for cloud providers as a method of customer adoption, in the same way free software has helped the adoption of traditional software. But a major issue remains. The biggest problem in providing free cloud resources, as in most other areas of open source, is how to eventually monetize it. Right now it seems the quickest route to the monetization of free is to sell (your business, and your users) to some larger organization, making it their problem. Open source is a great tool in an established market (MySQL vs. Oracle, etc.), but in an emerging market it has the potential to cause more harm than good, potentially driving the price to zero -- which in the long run isn't sustainable. On the flip side, near-zero-cost capacity does open up a wide range of other potential applications and usages we probably haven't even considered yet. I call it the Twitter business model: build it, grow it, then figure out a way to monetize it.
Sunday, December 6, 2009
Before I go into the details of my trip, I first need to give you some background on what led to my bizarre series of events. Although I was born in Haifa, a city in northern Israel, I left the country in 1982 at the age of 4, moving with my parents to Canada. Over the nearly 30 years since I left, I have been lucky enough to travel all over the world with generally little in the way of problems. Wherever I travel, I've always used my Canadian passport, which generally earns me a warm welcome regardless of the country I'm visiting. As an individual, I've always identified both professionally and personally as a Canadian. When I speak, I, like many other Canadians, throw in the casual "eh" at the end of sentences, and Americans routinely make fun of my "outs" and "abouts". I'm told they sound funny. So for all practical purposes, I am Canadian.
But because of where I was born, in the back of my mind I knew I was technically an Israeli citizen, though I never gave it much thought. Being born in Israel to a Swiss mother and a Canadian father gave me a unique gift: that of having three citizenships, two of which, Israeli and Swiss, require military service. Since leaving Israel at the age of 4, I had never had the opportunity to go back -- not so much a conscious decision as the fact that I never really had any reason to visit, whether for business or otherwise. Unlike Israel, though, I have been to Switzerland many times over the years and even have an active Swiss passport (which I rarely use). During my many trips to Switzerland, I have never been asked about military duty, so I falsely assumed the same would be true in Israel -- making what transpired all the more surprising.
Back to my arrival in Israel. At first I thought, "Wow, Avner and the folks from the Israeli Association of Grid Technologies (IGT), who had invited me to speak at their annual summit, really go all out. I haven't even gone through passport control and I'm already being greeted with a warm welcome." Well, it turns out the welcome wasn't as warm as I thought. The next thing I know, I'm being escorted to a secret, label-less back room at the airport and told to wait. So for about two hours I waited, as occasionally attractive young Israeli women with large machine guns would come in and say something to me in Hebrew, which I don't speak. After a while they realized I didn't speak Hebrew and asked, "What kind of Israeli doesn't speak Hebrew?" To which I responded, "A Canadian." They then asked me a series of questions (who my parents were, where I was born, etc. -- all of which they already knew).
The next part caught me by surprise; remember, this was supposed to be a short (72-hour) trip to Israel. A young woman tells me that as an Israeli citizen I have two conditions to meet before I can leave. First, I can't leave the country without permission from the Department of the Interior and must get an Israeli passport; when I asked how long that would take, she told me several weeks. Then the best part: second, I must report as soon as possible for my Israeli military service in a place called Tiberias, on the western shore of the Sea of Galilee, not far from Jordan and Syria. When I said again that I was just visiting, the official indicated that I was now officially in the Israel Defense Forces (IDF).
I was now on my own in a country whose language I didn't speak and that I certainly didn't identify with, effectively drafted into one of the most well-funded and active defense forces on the planet. To give you some background on the Israel Defense Forces (IDF): in 2008 Israel spent $16.2 billion on its armed forces ($2,300 per person), giving it the biggest ratio of defense spending to GDP of all developed countries. All male citizens are required to serve three years in the IDF, with exceptions made only on religious, physical or psychological grounds. Arguably the IDF is one of the most politically charged defense forces on the globe -- not exactly how I envisioned spending my next three years.
It wasn't that I was afraid of being in the army so much as the thought of potentially being away from my family in what most certainly felt like a strange foreign land. With an 11-month-old baby at home and my wife and I expecting another, I focused on how to get out of this most awkward predicament I suddenly found myself in. So now, instead of focusing my attention on the business meetings and presentations I was supposed to have over the next few days, I would have to focus on what felt like getting my freedom back. Luckily, my new Israeli friends and business partners stepped up to help me out.
Some of the biggest help came from an Israeli business partner (who asked not to be named). When I eventually emerged that evening from the holding area in the airport, he was there waiting for me and sprang into action. Within minutes he had called senior contacts within the Israeli government, contacts that would eventually include the Deputy Prime Minister of Israel as well as various other high-ranking officials. He then detailed a strategy that would have me visit both the Department of the Interior and the biggest army base in the country.
While my business partner was calling everyone he knew, on the second day of my trip I attended the conference as much as I could. After all, I was in Tel Aviv for the World Cloud Computing Summit and CloudCamp Tel Aviv, both of which ended up being successes with great turnouts. Needless to say, there is a tremendous amount of interest in cloud computing in Israel, with several hosting companies announcing they would be offering cloud-related products and services. But alas, this aspect of my trip was greatly overshadowed by my worries about being conscripted into the military and not being able to leave the country. Anyone who follows my Twitter account could easily see I was somewhat stressed over the situation. But thanks to the huge outpouring of support from the Israelis I met, my situation would soon be resolved with the greatest of efficiency. Literally dozens of people made phone calls and provided me with advice. It seemed that anyone with a friend in the IDF had them call on my behalf; at one point a senior military commander noted that I must have been a very special person, because he had received no fewer than 10 calls about me in the previous 24 hours.
All in all, it took roughly 48 hours to get my situation fully resolved: first the issuing of an Israeli passport (which was given to me 45 minutes after it was requested -- a new record, I'm told), then a visit to the largest military base in the country, Camp Rabin, named for Yitzhak Rabin. The base was one of the first IDF bases and has served as the IDF headquarters since Israel's founding in 1948. Think of it like the Pentagon in the U.S.
One of the things that struck me at Camp Rabin (other than that it reminded me of a good unconference name) was the age of the average enlistee, somewhere between 18 and 21 years old -- and, unsurprisingly, all heavily armed. It felt like a summer camp with guns.
After a few hours of back and forth between the IDF HQ and my outpost in Tiberias, I was given my release papers. The papers were in Hebrew, but luckily my local partner, who seemed to have become both my chauffeur and my translator, was there to help. He told me that I had been discharged from the IDF for reason of "old age" and that the papers also said I was free to leave the country.
Yes, one crazy business trip.
Sunday, November 29, 2009
Specifically, "Just as a number of local or regional companies provide both electricity and gas, independent telephone companies would be encouraged to provide both telephone and information utility services in their respective territories"
The original copy of this intriguing document resides in the Smithsonian National Museum of American History, Lemelson Center for the Study of Invention & Innovation, in the Western Union Telegraph Company Records archival collection covering the years 1820-1995.
Here is the complete text.
1965: Western Union's Future Role-as the Nation's First Cloud Utility
Thursday, November 26, 2009
First let's look at scaling out, or scaling horizontally, which basically means adding more nodes to a distributed system, such as new servers or storage (which is easier). These could be physical or virtual servers. An example might be scaling out from one web server to many dedicated slave machines. Google has made an art form of scaling out; they have data centers around the globe geared toward this one core task -- just-in-time hardware provisioning -- but for most this is a very difficult and costly endeavour. Virtualization makes this sort of instant replication & provisioning of many virtual machines much easier.
Next is scaling up, or scaling vertically, which means adding resources to a single server in a distributed system. Typically this involves adding CPUs or memory to a single virtual server in the form of virtual CPU and RAM. Unlike a physical server, in a virtual environment you can change your virtual hardware characteristics; a physical server is what it is. It runs at its maximum potential, limiting its ability to easily scale up. If you need more scale, you need more hardware, or you have to manually add more components to the physical server (RAM, CPU, storage, etc.), which means downtime while the server is upgraded. In a virtual environment this isn't a limitation, and scaling up can often be done on the fly.
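The contrast between the two strategies can be sketched as a toy capacity model: scaling out adds identical nodes until demand is covered, while scaling up grows a single node's resources. The functions, units and thresholds here are all invented for illustration.

```python
def scale_out(nodes, demand, capacity_per_node):
    """Horizontal: add identical nodes until demand is covered
    (e.g. cloning another VM from the same image)."""
    while nodes * capacity_per_node < demand:
        nodes += 1
    return nodes

def scale_up(node_capacity, demand, step=1):
    """Vertical: grow one node's resources until it copes
    (e.g. hot-adding vCPU/RAM to a running VM)."""
    while node_capacity < demand:
        node_capacity += step
    return node_capacity

# Demand of 100 units, starting from 2 nodes of capacity 10 each:
print(scale_out(nodes=2, demand=100, capacity_per_node=10))  # 10 nodes
# Same demand handled by one node, grown in steps of 8 units:
print(scale_up(node_capacity=20, demand=100, step=8))        # 100 units
```

The point of the sketch is the shape of each loop: scaling out changes the node count, scaling up changes a single node's size -- and in a virtual environment both can be driven programmatically, which is the next paragraph's argument.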
Vertical scaling of existing systems also lets you better leverage virtualization technology, because it provides more resources to the hosted operating system and applications, which can share those resources in a multi-tenant environment. Virtualization also allows more automated, programmatic control of system resources in correlation with the demands placed on the infrastructure or application being hosted. This is because in a virtual infrastructure you are not managing actual physical components but virtual representations of them.
So it is very true that virtualization isn't a requirement for a cloud infrastructure; it just makes it a heck of a lot easier to manage and to scale out, up, or both.
Wednesday, November 25, 2009
Hotel Seoul Kyoyuk Munhwa HoeKwan
202, Yangjae-Dong, Seocho-Gu, Seoul
13:55 Introduction & Networking
14:10 Lightning Talks (long form) - chaired by Chan-Hyun Yoon, KAIST (30 min each)
15:10 Coffee Break
16:30 unPanel / Breakout Discussion - chaired by Yang-Woo Kim (Dongkuk University)
17:30 Wrap Up, Dinner & Networking
- KCSA (Korea Cloud Service Association)
- KISTI (http://www.kisti.re.kr/english/index.jsp)
Monday, November 23, 2009
Let me point out a few of the more interesting moments from my trip to the land of the rising sun. As I mentioned in my previous post about the opportunities for cloud computing in Asia, if my schedule is any indication of the demand for cloud products, there is a tremendous amount of it. Every minute of my trip was accounted for with non-stop meetings. I will also point out that the Japanese know how to entertain. As you can probably tell, I do a lot of traveling and am quite frequently taken to fancy restaurants, and nothing comes close to the fine restaurants of Tokyo. Duck sashimi, anyone?
As for CloudCamp Tokyo, it was well attended, with more than 160 people there. One of the more interesting aspects of the camp was how the Japanese interact in an unconference setting. To put it simply, they don't. Getting them to speak publicly was a challenge. A few asked questions, but generally it was a one-way conversation: I spoke, my translator spoke. The lightning presentations were also very well received. After the main unconference is when things got interesting. We had an open bar, which probably helped loosen things up a bit. In orderly, single-file fashion, almost every one of the 160 or so attendees proceeded to introduce themselves to me, handing me their business cards with both hands, followed by a bow and a "Hajimemashite" (a polite 'Hello, I am pleased to make your acquaintance', used only the very first time you meet).
I also found it interesting how much of a barrier the language and cultural differences are. Unlike in Europe, where most business people speak English, in Japan most don't. To get around this, we worked with a large Japanese systems integrator, which provided us with two very nice Japanese translators (Eno-san and Maki-san - pictured). The firm also provided us with introductions to most of the major Japanese cloud customers, including the top hosting companies, data centers, telecoms, etc. Without the help of the SI we would have had a much more difficult time; a good portion of our meetings involved our translators doing the majority of the talking. So my suggestion to any company looking to sell cloud products and services in the Asian market is to find yourself a local partner who can act as a guide to the local business scene.
All in all, a successful week in Japan. Next week I'll be in Tel Aviv at The World Summit of Cloud Computing. Should be interesting.
(P.S) Wear a suit and tie.
ENISA, supported by a group of subject matter experts comprising representatives from industry, academia and governmental organizations, has conducted, in the context of the Emerging and Future Risk Framework project, a risk assessment of cloud computing business models and technologies. The result is an in-depth and independent analysis that outlines some of the information security benefits and key security risks of cloud computing. The report also provides a set of practical recommendations.
A few highlights of the report include:
- The Cloud’s economies of scale and flexibility are both a friend and a foe from a security point of view. The massive concentrations of resources and data present a more attractive target to attackers, but cloud-based defences can be more robust, scalable and cost-effective. This paper allows an informed assessment of the security risks and benefits of using cloud computing - providing security guidance for potential and existing users of cloud computing.
- Scale: commoditisation and the drive towards economic efficiency have led to massive concentrations of the hardware resources required to provide services. This encourages economies of scale - for all the kinds of resources required to provide computing services.
- Architecture: optimal resource use demands computing resources that are abstracted from underlying hardware. Unrelated customers who share hardware and software resources rely on logical isolation mechanisms to protect their data. Computing, content storage and processing are massively distributed. Global markets for commodities demand edge distribution networks where content is delivered and received as close to customers as possible. This tendency towards global distribution and redundancy means resources are usually managed in bulk, both physically and logically.
STANDARDISED INTERFACES FOR MANAGED SECURITY SERVICES: large cloud providers can offer a standardised, open interface to managed security services providers. This creates a more open and readily available market for security services.
LOCK-IN: there is currently little on offer in the way of tools, procedures or standard data formats or service interfaces that could guarantee data, application and service portability. This can make it difficult for the customer to migrate from one provider to another or migrate data and services back to an in-house IT environment. This introduces a dependency on a particular CP for service provision, especially if data portability, as the most fundamental aspect, is not enabled.
ISOLATION FAILURE: multi-tenancy and shared resources are defining characteristics of cloud computing. This risk category covers the failure of mechanisms separating storage, memory, routing and even reputation between different tenants (e.g., so-called guest-hopping attacks). However, it should be considered that attacks on resource isolation mechanisms (e.g., against hypervisors) are still less numerous and much more difficult for an attacker to put into practice than attacks on traditional OSs.
MANAGEMENT INTERFACE COMPROMISE: customer management interfaces of a public cloud provider are accessible through the Internet and mediate access to larger sets of resources (than traditional hosting providers) and therefore pose an increased risk, especially when combined with remote access and web browser vulnerabilities.
Read the Complete Report Here >
Monday, November 16, 2009
Today, at the Microsoft Professional Developer Conference (PDC) in Los Angeles, Microsoft announced not only the release of version 4.0 of the .NET Micro Framework, but also that it is open sourcing the product and making it available under the Apache 2.0 license, which is already widely used by the community within the embedded space.
The .NET Micro Framework, a development and execution environment for resource-constrained devices, was initially developed inside the Microsoft Startup Business Accelerator, but recently moved to the Developer Division so as to be more closely aligned with the overall direction of Microsoft's development efforts.
Thursday, November 12, 2009
One of the more interesting side effects of creating the CloudCamp series of events around the globe has been its value as a market research vehicle. As interest in Cloud Computing increases in a given geographic region, so does the number of folks on the ground who want to help organize local CloudCamp events. This network of local organizers has become an invaluable source of insight into new markets. These events have also done a tremendous job of forecasting potential high-growth markets and, more importantly, the opportunities for Cloud Computing within various emerging markets. And lately it seems that by far the largest opportunities are coming from one particular region of the world.
To give you some background, we have an upcoming CloudCamp next week in Tokyo (November 17th), organized by NTT among others, as well as one next month in Seoul, South Korea (Dec 16th), organized by the Korea Institute of Science and Technology Information and the newly formed Korea Cloud Service Association. The Japanese, South Korean and Chinese markets have been particularly strong for CloudCamp. Based on this interest, we will also be doing a series of CloudCamps in China (Shanghai, Beijing and Hong Kong), which will most likely take place in early 2010. (If you're interested in sponsoring one of these events, please get in touch.)
As a more personal example, I will be in Tokyo next week for the CloudCamp Tokyo event on Tuesday as well as a number of business meetings. Purely from a demand point of view, from the moment I get off the plane on Monday until I leave on Sunday, I have non-stop meetings from 9am through dinners late into the evening, every night of the week, with various Japanese firms looking to capitalize on the booming Cloud Computing sector. We've seen so much interest from Japan that we've had to start turning down meeting opportunities. To say the least, the interest in "Kumo" (Japanese for cloud) is astounding.
We've seen similar levels of interest in China, where there seems to be a technological renaissance occurring. China is a unique place when it comes to Cloud Computing. First of all, they don't have the legacy infrastructure that most Western economies suffer from. It's, in a sense, a greenfield opportunity where the Chinese can choose the latest and best technology solutions without regard for how they may affect legacy systems -- since there really aren't any.
For instance, look at the massive adoption of mobile phones over the last several years: the traditional landline was almost completely bypassed for newer and more efficient mobile options. Computing is seeing a similar bypass, with projects such as national wifi networks being built in conjunction with a massive multi-billion-dollar national railway system. The Chinese seem to have realized that a national infrastructure is more than just a physical one, but also a virtual one.
I'm not alone in drawing this conclusion about the Asian market. In a recent report, Gartner said infrastructure software will account for 64.4 percent of overall enterprise software spending in the Asia-Pacific region next year, with APAC enterprise software spending to grow 10.2% in 2010 -- the fastest growth in any of the various global software markets.
In the same vein, Amazon Web Services has just announced an expansion into the Asian region in the first half of 2010, saying "AWS customers will be able to access AWS’s infrastructure services from multiple Availability Zones in Singapore in the first half of 2010, then in other Availability Zones within Asia over the second half of 2010. AWS services available at the launch of the Asia-Pacific region will include Amazon EC2, Amazon S3, Amazon SimpleDB, Amazon Relational Database Service, Amazon Simple Queue Service, Amazon Elastic MapReduce, and Amazon CloudFront."
“Developers and businesses located in Asia, as well as those with a multi-national presence, have been eager for Asia-based infrastructure to minimize latency and optimize performance,” said Adam Selipsky, Vice President of Amazon Web Services. “We’re very excited to announce the expansion of AWS infrastructure into Asia to help our customers plan their technology investments and better serve their end-users in Asia.”
Tom Lounibos, CEO of SOASTA, had an interesting comment on the opportunity in a Twitter post earlier, saying "AWS announces Singapore site 7 hours ago, and I wake to three SOASTA customer requesting Cloud Testing from Singapore! "Demand" wins!"
Although I am just one man from just one company, I believe that in some small way both Enomaly and CloudCamp represent the tip of the iceberg when it comes to the opportunity to offer Cloud Computing related products and services to the Asian market -- and from where I sit, there is no bigger opportunity than in Asia.
Monday, November 9, 2009
ECP is a carrier-class architecture and cloud hosting platform which supports the deployment of very large public cloud infrastructure for service providers. The platform has been designed to span multiple federated data centers in disparate geographies around the globe, handling hundreds of thousands of VMs and multi-tenant customers.
This version of ECP Service Provider Edition brings the following enhancements over 3.0.2:
- KVM is now directly supported as a hypervisor at install time.
- Sample data is installed during initial installation, so there is no need to create a customer/group/permissions before testing the system. See INSTALL for default user/pass.
- VNC window in customer UI is now identical to the Admin UI. Passwords for the VNC console are now found under Info button at VM level.
- Info window now shows how to connect with an external VNC client as well as the existing Java applet.
- VNC window can be disabled entirely on a per VM basis.
- App Center can now be searched/filtered. This is useful if you offer a large number of appliances.
- Admin Dashboard now shows graphical whole cluster resource usage.
- Network Manager has been removed. All deployments are recommended to use DHCP for IP assignment going forward.
- Various performance improvements have been added at customer UI level.
- Various performance improvements have been added to infrastructure code.
Enomaly's Cloud Service Provider Edition extends our core ECP platform, already used by thousands of organizations around the world, with the key capabilities needed by xSPs, carriers, and web hosting providers who want to offer an Infrastructure-on-Demand or IaaS service to their customers. Enomaly ECP Service Provider Edition provides a powerful but simple customer self-service interface, a customer-facing REST API, a theme engine, strong multi-tenant security, a hard quota system, and flexible integration with your billing, provisioning, and monitoring systems.
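To make the customer-facing REST API idea concrete, here is a minimal sketch of what provisioning a VM through such an API might look like. This is purely illustrative: the endpoint path, JSON field names, and bearer-token auth are my own assumptions, not Enomaly's documented interface.

```python
import json
import urllib.request

# Hypothetical base URL for a provider running a self-service IaaS API.
BASE_URL = "https://cloud.example-provider.com/api/v1"

def build_provision_request(name, cpus, ram_mb, api_token):
    """Construct (but do not send) a hypothetical VM-provisioning request.

    All field names and the "/machines" path are illustrative assumptions.
    """
    payload = json.dumps({"name": name, "cpus": cpus, "ram_mb": ram_mb}).encode()
    return urllib.request.Request(
        BASE_URL + "/machines",
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_token,  # placeholder auth scheme
        },
        method="POST",
    )

req = build_provision_request("web-01", cpus=2, ram_mb=2048, api_token="demo-token")
print(req.get_method(), req.full_url)
```

Sending the request (with `urllib.request.urlopen(req)`) would only make sense against a live endpoint; the point here is simply the shape of a self-service provisioning call that a billing or monitoring system could integrate with.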
Screen Shots (click to enlarge)
Sunday, November 8, 2009
As someone who spends his days eating, breathing and sometimes drinking cloud computing, I find it fun to watch the debate recently devolve into one focused purely on the finer semantic nuances of the various terminologies. The debate seems to generally focus on the varied usages within the companies that are attempting to "cloud-ify" themselves and their products/services. This cloudification seems to be the trend du jour within the technology industry: an attempt to augment marketing materials and/or product positioning to include cloud-related buzzwords, whether they make sense or not.
Actually, one of the better-stated criticisms comes from Oracle CEO Larry Ellison, who observes that cloud computing has been defined as "everything." It's everything and nothing in particular, a trendy word used more to impress than to explain a particular problem. I for one completely agree.
As a marketing term, cloud has enabled us to broadly define the movement away from the desktop/server centric past to the cloud [Internet] enabled future. Wikipedia's cloud definition says it well: "it is a paradigm shift where technological details are abstracted from the users who no longer need knowledge of, expertise in, or control over the technology infrastructure 'in the cloud' that supports them." Yup, enough said.
This message is to those of you jumping on the cloud bandwagon, so let me say this as plainly as possible. Regardless of whether it's "the cloud" or "cloud computing," it all comes back to the fact that it's a buzzword. A way to say we're cool, we're now, we're new, without saying it directly (a neologism). It's the New Coke of Computing / the new taste of the Internet.
So what is The Cloud? It's the Internet. And what is Cloud Computing? It's the next big thing in computing, it's using the Internet.
Friday, November 6, 2009
More specifically, it was created with an open collaboration model in mind, where both large companies and individuals can collaborate as equals without fear of legal ramifications. Using the OWFa, the actual spec development can be done in any forum the participants choose (unincorporated Google Groups / social networks, non-profits, startups, enterprises, etc.)
I'll also be the first to point out that one of the key authors is David Rudin, a Microsoft Standards Attorney. But regardless of Rudin's employer, this is a well-thought-out document, and I for one am very excited by the potential usage of the OWFa within a variety of standards processes. I believe the OWFa has the potential to dramatically affect the way we as an industry collaborate and innovate when it comes to the development of common, truly open standards, whitepapers and best practices. I encourage anyone who truly believes in the creation of an Open Web to take a look at the OWFa.
You can download a copy of the final draft from here.
To start things off, below is the first in what I hope will be many of these. I'm calling this new feature Transient Ambiance: the mood evoked by my ever-changing environment.
When I recorded this entry using my iPhone's Voice Memo app, I found myself in the London Underground. As I waited in the tube station for my return to Heathrow airport, I was in the midst of one of those strange, surreal moments: only a violinist and myself in the middle of a typically busy London Underground station. A momentary period of solitude in an otherwise hectic week of meetings and presentations. As I sat pondering life's mysteries, soft melodic music echoed off the dark, damp underground walls.
Download here (MP3, 2.95 MB, 3:13)
The scope will include Standardization for interoperable Distributed Application Platform and services including Web Services, Service Oriented Architecture (SOA), and Cloud Computing. SC 38 will pursue active liaison and collaboration with all appropriate bodies (including other JTC 1 subgroups and external organizations, e.g., consortia) to ensure the development and deployment of interoperable distributed application platform and services standards in relevant areas.
Similar to other ISO initiatives, each member country that's interested in participating in this group will come up with its own structure to provide feedback on work items and establish voting positions, including the InterNational Committee for Information Technology Standards (INCITS), which will be the US TAG.
Administrative support and leadership of SC 38 will be provided as follows:
The US National Body will serve as Secretariat for the SC and its Working Groups, and Dr. Donald R. Deutsch from the US National Body will serve as the Chair for the SC. The National Body of China will provide Ms. Yuan Yuan as the Convenor of the Working Group on SOA. The US National Body will provide the Convenor of the Working Group on Web Services. The National Body of Korea will provide Dr. Seungyun LEE as the Convenor of the Study Group on Cloud Computing. The National Body of China will provide Mr. Ping ZHOU as the Secretary of the Study Group on Cloud Computing.
I’ve pasted the complete resolution in detail below.
Resolution 36 ‐ New JTC 1 Subcommittee 38 on Distributed Application Platforms and Services (DAPS)
JTC 1 establishes a new JTC 1 Subcommittee 38 on Distributed Application Platforms and Services
(DAPS) with the following terms of reference:
Title: Distributed Application Platforms and Services (DAPS)
Scope: Standardization for interoperable Distributed Application Platform and services including:
• Web Services,
• Service Oriented Architecture (SOA), and
• Cloud Computing.
SC 38 will pursue active liaison and collaboration with all appropriate bodies (including other JTC 1 subgroups and external organizations, e.g., consortia) to ensure the development and deployment of interoperable distributed application platform and services standards in relevant areas.
As per the JTC 1 Directives, SC 38 will establish its own substructure at its first meeting. Based on discussions at the JTC 1 Plenary, it is anticipated that SC 38 will initially establish subgroups as follows:
a. A Working Group on Web Services
o Draft Terms of Reference:
i. Enhancements and maintenance of the Web Services registry (inventory database of Web Services and SOA Standards).
ii. Ongoing maintenance of previously approved standards from WS‐I PAS submissions, ISO/IEC 29361, ISO/IEC 29362 and ISO/IEC 29363.
iii. Maintenance of possible future PAS and Fast Track developed ISO/IEC standards in the area of Web Services.
iv. Investigation of where web service related standardization is already ongoing in JTC 1 entities.
v. Investigate gaps and commonalities in work in “iv” above.
b. A Working Group on SOA
o Draft Terms of Reference:
i. Enumeration of SOA principles.
ii. Coordination of SOA related activities in JTC 1.
iii. Investigation of where SOA related standardization is already ongoing in JTC 1 entities, and
iv. Investigate gaps and commonalities in work in “iii” above
c. A Study Group on Cloud Computing (SGCC) to investigate market requirements for standardization, initiate dialogues with relevant SDOs and consortia and to identify possible work items for JTC 1.
o Draft Terms of Reference:
i. Provide a taxonomy, terminology and value proposition for Cloud Computing.
ii. Assess the current state of standardization in Cloud Computing within JTC 1 and in other SDOs and consortia beginning with document JTC 1 N 9687.
iii. Document standardization market/business/user requirements and the challenges to be addressed.
iv. Liaise and collaborate with relevant SDOs and consortia related to Cloud Computing.
v. Hold workshops to gather requirements as needed.
vi. Provide a report of activities and recommendations to SC 38.
Topics related to Energy Efficiency of Data Centers are excluded. On topics of common interest (such as virtualization), coordination with the EEDC SG is required.
Membership in the Study Group will be open to:
1. National Bodies, Liaisons, and JTC 1 approved PAS submitters
2. JTC 1 SCs and relevant ISO and IEC TCs
3. Members of ISO and IEC central offices, and
4. Invited SDOs and consortia that are engaged in standardization in Cloud Computing, as approved by the SG
In addition, the Convenor may invite experts with specific expertise in the field.
Meetings of the group may be via face‐to‐face or preferably by electronic means.
The SC 38 Secretariat will issue a call for participants for the Study Group.
The SGCC Convenor is instructed to provide a report on the activities of the Study Group at the SC 38 2010 Plenary meeting.
Administrative support and leadership of SC 38 will be provided as follows:
a. The US National Body will serve as Secretariat for the SC and its Working Groups, and Dr. Donald R. Deutsch from the US National Body will serve as the Chair for the SC.
b. The National Body of China will provide Ms. Yuan Yuan as the Convenor of the Working Group on SOA.
c. The US National Body will provide the Convenor of the Working Group on Web Services.
d. The National Body of Korea will provide Dr. Seungyun LEE as the Convenor of the Study Group on Cloud Computing.
e. The National Body of China will provide Mr. Ping ZHOU as the Secretary of the Study Group on Cloud Computing.
Monday, October 26, 2009
When I speak at conferences, my presentations have shifted from "what is" or "how does" to "where do we go from here?" Somewhere along the way, people started asking me to act as a kind of futurist or, more specifically, to speculate about the future. And funny as this may sound, I actually quite enjoy this new role of prognosticator. So, in keeping with the theme, I'm going to prognosticate a little on this October evening in the year 2009.
So what's next, you say? Some say the semantic web; to me this is obvious, just applying semantics to what is already underway. Others say ubiquitous computing; I say this is an extension of what has for many already become ubiquitous. Others say maybe green, maybe mobile, maybe global. Regardless of the specifics, technology is quickly becoming a central aspect of many established economies. In other parts of the world, having a mobile phone is on par with a basic civil right like water or food. To not have this most basic form of communication is, in a sense, to not exist at all.
Yet there is a disconnect between the highly connected Western world and the highly socially interconnected emerging economies. In looking at the opportunities for technology in the coming decades and beyond, stop focusing on the specific features and functions within technology. Look to the opportunities for social change on an individual basis: the enablement of those who previously were not enabled.
I believe the biggest opportunities will be in enabling those with the least. So my prediction is simple: Want to get rich? Sell to the poor. Places like Brazil, China, India and Malaysia are in the midst of rapid and amazing transformations, brought about by access to technology that was previously unavailable. Access to information and affordable technology will be at the heart of this transformation. Feed this hunger and you will empower the people within these economies. Rather than focus on globalization, focus on regionalization. Focus on how technology will affect that single human being.
What is clear is that the 20th century belonged to the West, and the 21st will belong to the rest.
It's interesting to note that both Microsoft and Google released competing "open" initiatives today, with Microsoft announcing it is opening the PST format for Outlook. It's great to see both companies actively battling it out for "Open Cloud" supremacy.
In a blog post earlier today, Google outlined the ability to "Select one or more files and then click on "Export" from the "More Actions" menu. Next, pick the format (e.g. PDF, Microsoft Word, etc) you want for your exported files. Finally, click "Continue" and we'll give you a nice zip file to download that has all your content."
According to the post, "For now, you can "export" up to 500 MB of content in a single zip file, which is over 20,000 typical files. Sometimes it takes us a few minutes to export really large amounts of files, so instead of making you wait, we added an "Email when ready" option. We'll send you a link when the zip file is ready."
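The practical consequence of that 500 MB cap is that larger document collections have to be exported in multiple batches. Here is a small illustrative helper (not a Google API, just the arithmetic) that greedily groups file sizes into batches that each fit under the limit:

```python
# Illustrative only: groups file sizes into batches so that each exported
# zip stays under the 500 MB limit mentioned in Google's announcement.
LIMIT_BYTES = 500 * 1024 * 1024

def batch_files(sizes, limit=LIMIT_BYTES):
    """Greedily split a list of file sizes (bytes) into batches <= limit."""
    batches, current, total = [], [], 0
    for size in sizes:
        if size > limit:
            raise ValueError("single file exceeds the export limit")
        if total + size > limit:
            # Current batch is full; start a new one.
            batches.append(current)
            current, total = [], 0
        current.append(size)
        total += size
    if current:
        batches.append(current)
    return batches

# Example: three 200 MB files need two export batches (400 MB + 200 MB).
mb = 1024 * 1024
print(len(batch_files([200 * mb, 200 * mb, 200 * mb])))  # → 2
```

A greedy first-fit split like this is the simplest approach; it won't always produce the minimum number of batches, but for an export workflow that hardly matters.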
Give the feature a whirl and let Google know what you think.
See the video below. And I have no idea what I'm doing with my hands at the beginning of the video; guess I was nervous or something. ;)
Lorimer said that "In order to facilitate interoperability and enable customers and vendors to access the data in .pst files on a variety of platforms, we will be releasing documentation for the .pst file format. This will allow developers to read, create, and interoperate with the data in .pst files in server and client scenarios using the programming language and platform of their choice. The technical documentation will detail how the data is stored, along with guidance for accessing that data from other software applications. It also will highlight the structure of the .pst file, provide details like how to navigate the folder hierarchy, and explain how to access the individual data objects and properties"
He also admitted that the documentation is still in its early stages and work is ongoing, going on to say "We are engaging directly with industry experts and interested customers to gather feedback on the quality of the technical documentation to ensure that it is clear and useful. When it is complete, it will be released under our Open Specification Promise, which will allow anyone to implement the .pst file format on any platform and in any tool, without concerns about patents, and without the need to contact Microsoft in any way."
This initiative is part of Microsoft's Interoperability Principles, which they announced in early 2008. As part of this initiative Microsoft has committed product features, documented formats, and implementation of standards that allow interoperability. The move to open up the portability of data in .pst files is another step in putting these principles in action.
Lorimer also said that "Over the past year, Microsoft Office has taken several steps toward increasing openness and documenting interoperability guidelines, offering customers a choice of file formats and embracing a comprehensive approach that includes transparency into our engineering methods, collaboration with industry stakeholders, and shared stewardship of industry standards"
This is a great move by Microsoft!
Wednesday, October 21, 2009
As many of you know, I've been pushing for the creation of a formal Cloud Computing Trade Association for quite some time, but unfortunately we've lacked both the will of the industry and the money to make it happen. Given the recent announcement of the EuroCloud Cloud Computing association in Europe, the opportunity to do something here in North America, or more broadly, has re-emerged. In my previous discussions there was a tremendous amount of interest in the creation of an International Cloud Computing Trade Association, but when it came down to it, the money to fund such an endeavor wasn't there.
So I would again like to pose the question: is now the right time to re-engage this conversation? Or am I beating a dead horse?
As pointed out by Fred Zappert on the CCIF list earlier today, more than thirty companies have joined forces to form a new European Cloud Computing association called EuroCloud. The purpose of this new group is to bring together Euro-based SaaS and cloud computing vendors, enablers, integrators and industry experts to share best practices and promote new business opportunities across the continent.
The creators of EuroCloud have done a great job of outlining the rationale behind why they created the trade org. Similarly these points could apply to an International Cloud Organization.
- Europe has a fast growing SaaS and Cloud Computing industry, but each country is currently operating separately with few contacts in other European countries.
- National SaaS vendors are growing and are looking to build European and international relationships through business and technological partnerships.
- The European Authorities do not currently recognise the European Cloud Computing industry, which is an industry that can help stimulate the economic and technological environment to promote new Cloud Computing industries.
- Cloud Computing implies application integration into an Application-Oriented Ecosystem. Developing new application partnerships, both European and worldwide, represents the next crucial step.
EuroCloud, the pan-European Cloud Computing business network, through action at both local and European levels, will help to answer these new demands.
EuroCloud Goals
- To build a pan-European network organized in two tiers, with a national level (France, Spain, England, Belgium, etc.) and a European level. The national level focuses on local topics and the European level on European topics, under the EuroCloud brand (or another if appropriate in a national setting). Only companies who have an interest in Cloud Computing and participate in the Cloud ecosystem can be members of the network.
- Build relationships with the European authorities (Commission and Parliament) to help recognise the Cloud Computing industry as the future of IT in Europe and to promote a stimulating environment for development and growth of the industry.
- Promote business relationships between members throughout Europe and internationally with counterparts such as SIIA.
- Promote technological relationships between members throughout Europe and internationally.
Redux: The Case for a Cloud Computing Trade Association (Originally posted April 13th 2009)
In a recent report published by Gartner, the market research firm outlined the tremendous opportunity for global cloud services, projecting revenues to increase 21 percent this year alone. According to Gartner, cloud-based offerings made $46.4 billion in 2008, a number projected to increase to $56.3 billion in 2009 and $150.1 billion by 2013. With this phenomenal growth in revenue expected in the cloud computing sector, a few in the cloud industry have begun to ask whether it is time to form a cloud computing industry trade association. As many of you know, I have been pushing for the creation of such an organization for a while. I thought I'd briefly lay out some of the opportunities I see for the creation of a cloud computing trade association and how it might look.
Generally, the idea is the creation of a formal cloud computing trade association founded and funded by businesses that operate in our industry. The association would engage in various public relations activities such as advertising, education, lobbying, publishing and certification, but its main focus would be collaboration between companies.
First of all, I'm not advocating for the creation of an organization focused on a particular ideology, such as software licensing models or source code development, but instead one focused on accelerating the adoption of cloud computing by encouraging broader industry collaboration. More simply, a formalized "legal" trade association which brings together both small and large companies (startup or enterprise), while also bridging the greater cloud community by including the customers and users who all share a stake in the adoption and advancement of cloud-centric technologies.
I believe the association should focus on the commonalities we share -- accelerating the adoption of cloud computing through a consensus view of the general opportunities cloud based technology brings to customers. I'm not speaking about defining what cloud computing is so much as defining the problems it solves and opportunities it enables. The things we can actually agree on.
To accomplish this, there are a number of joint advocacy and marketing programs the association could engage in. These include web-centric activities such as industry forums, social networks and online collaboration tools, as well as "in person" activities such as local user groups, unconferences and trade shows. The association may also be in a position to assist in the creation of reference architectures, use cases, and white papers that help illustrate the "how" and "why".
One of the concerns I've heard repeatedly is the potential barrier to entry for participation in this type of association. The last thing this association should be is an exclusive club for a few select technology vendors and insiders. It needs to be available to all, and should foster engagement with the existing community while also providing a formal/legal umbrella that the larger companies will feel comfortable participating in. I am also cognizant that it takes money to make money, so there needs to be a middle ground, with some of the larger vendors potentially subsidizing the involvement of the smaller players and independent members. Simply put, membership should not be cost-prohibitive.
The question of standardization also keeps recurring, and is probably one of the most debated topics when discussing the creation of a cloud trade association. It is my opinion that the last thing the world needs is yet another standards body. There are dozens of existing groups and organizations that would be ideal partners; I say let's work with them. I would guess a good portion of the members of such a trade association would already have memberships in existing standards bodies. I agree that there is no need to reinvent the wheel or boil the ocean. I'd rather see this association partner with the standards world rather than compete with it.