Wednesday, April 30, 2008

Stop Meshing with Cloud Computing

The buzz phrase effect has finally kicked in with Cloud Computing. A few years ago, "Web Services" meant an XML message passed between two web systems using a simple protocol. Once it became a buzz phrase and enterprise marketers got hold of it, Web services came to mean anything that did something (they wanted to sell) online. The same thing is starting to happen with Cloud Computing. We often try to define what a technology is (standards) by what it is not. Here's my exclusion list:

1. Cloud Computing is not a Web portal - My Yahoo, iGoogle and the like are not examples of Cloud Computing. They may run on the cloud, and they may even interface to cloud-based systems, but they are not by definition cloud computing. Why? Because they are not "computing" platforms; they are very interesting Web properties.

2. Microsoft's new Mesh is not cloud computing - Let me quote the source: "Live Mesh puts you at the center of your digital world, seamlessly connecting you to the people, devices, programs, and information you care about— available wherever you happen to be." HUH? I think there are clouds involved in that statement, but they aren't computing clouds. This is another example of MS being led by people who don't understand who their competition is.

3. Virtual servers at (you name it: GoDaddy, Rackspace, etc.) are not Cloud Computing - Virtualization, while a very powerful tool in the advancement of cloud computing, does not by itself make a solution Cloud Computing. The missing element is cloud abstraction. Amazon's EC2 would not be a cloud computing platform if I had to call a salesperson to buy a virtual instance. My virtual instances in EC2 are transient and I pay for them in a utility model. I can stand up and shut down instances with a web service command (yes, a TRUE web service), as sketched below.
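
For illustration, here's roughly what that utility model looks like in code. This is a minimal sketch using Amazon's modern boto3 Python library (the original API was a SOAP/Query web service), and the region, AMI ID and instance type are placeholders, not my actual setup:

```python
# Minimal sketch: provisioning as an API call, not a sales call.
# boto3 is the modern AWS SDK; the AMI ID below is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Stand up an instance with one call...
reservation = ec2.run_instances(
    ImageId="ami-12345678",   # placeholder machine image
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = reservation["Instances"][0]["InstanceId"]

# ...and shut it down with another. Billing stops when the instance does.
ec2.terminate_instances(InstanceIds=[instance_id])
```

No sales call, no contract, no provisioning ticket. That abstraction is what makes it a cloud.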

In my opinion, there are three technology candidates for the title of Cloud Computing system.

1. The portable Web - deployable widgets, portlets or other elements built on top of Adobe AIR, Google Gears or even (big throwback) a Java applet. These computing elements can be designed to serve a specific computing function and be deployed across multiple platforms without concern for the underlying infrastructure.

2. Autonomic virtualized clouds - Amazon Web Services, Microsoft SSDS (when it actually happens) or VMware-based solutions for computing or storage in an on-demand configuration that provides utility computing over a Web connection.

3. Virtual Application Servers - Google App Engine is a great example of where Cloud Computing is going. The Google implementation on Python, with its limited API, won't draw enterprise customers, but it does give us the first working "appserver in the cloud". Where Google shines in this regard is simplicity: relatively low-skilled programmers can quickly develop working applications without any concern for the underlying implementation of the services they are using (see the sketch below).
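
To show just how simple, here is roughly what a complete App Engine application looks like with the webapp framework bundled in the Python SDK. This is a from-memory sketch of the getting-started pattern, not code from a real project:

```python
# A complete App Engine application, more or less: one handler, one route.
# Uses the webapp framework bundled with the original Python SDK.
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class MainPage(webapp.RequestHandler):
    def get(self):
        # Respond to GET / with plain text. Everything below this layer
        # (servers, scaling, routing, patching) is Google's problem.
        self.response.headers['Content-Type'] = 'text/plain'
        self.response.out.write('Hello from the cloud!')

application = webapp.WSGIApplication([('/', MainPage)], debug=True)

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()
```

Add a small app.yaml, deploy, and you have a running application. Nowhere in that picture is there a server to configure.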

Hopefully the message of Cloud Computing won't be completely drowned out by the overuse and abuse of those who are simply trying to ride its coattails. Maybe we can start a new trend by coining another buzz phrase. I'm not sure what the next one will be, but I'm in favor of the Semantic Cloud Computing Web Service 2.0. What do you think?

Saturday, April 26, 2008

Who's paying for cloud computing?

With the advent of cloud computing for the mainstream (read: any smart coder), we have seen an explosion of applications that fall into the category of "Web 2.0" and social networking. The vast majority are being hosted on Amazon AWS, but with Google App Engine and Microsoft's announcement of an S3 alternative, competition has already begun to take effect.

One of the lessons that Amazon has learned from Google is that micropayments add up to big bucks. Every day I spend a few dollars on Google AdWords to get traffic to DigitalChalk, and I spend a few dollars running server instances and streaming videos on Amazon. I'm not making Amazon or Google rich, but I spend money with them every single day. No other vendors except my utility companies have their hooks into me that way. I'm not complaining; quite the contrary. I can stop spending with either vendor with a single mouse click. I'm choosing to spend money with them because it has a very easy-to-measure return.
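
To put some illustrative numbers on it, here's a back-of-envelope sketch. Every price and usage figure below is an assumption for illustration (loosely remembered circa-2008 list prices), not DigitalChalk's actual bill:

```python
# Back-of-envelope sketch of how per-hour micro-charges add up.
# All prices and usage figures are assumptions for illustration only.
EC2_SMALL_PER_HOUR = 0.10   # USD per instance-hour (m1.small, circa 2008)
S3_STORAGE_PER_GB  = 0.15   # USD per GB-month of storage
TRANSFER_PER_GB    = 0.17   # USD per GB transferred out (first tier)

instances       = 2    # small server instances running 24x7
storage_gb      = 50   # video library parked in S3
streamed_gb_day = 5    # video streamed to students per day

daily = instances * 24 * EC2_SMALL_PER_HOUR + streamed_gb_day * TRANSFER_PER_GB
monthly = daily * 30 + storage_gb * S3_STORAGE_PER_GB

print(f"~${daily:.2f} per day, ~${monthly:.2f} per month")
# ~$5.65 per day, ~$177.00 per month -- "a few dollars a day" indeed
```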

To my point: I read today on Charles Cooper's blog about the death of Web 2.0 and the emergence of the Semantic Web, which, as he defines it, is cloud computing running Web 2.0 apps. I think his observations are mostly on target, but I think we are going to see the natural migration from early adopters (like me) to mainstream adoption (the enterprise) as the next phase of the cloud.

We recently started working with other businesses interested in migrating from a traditional data center to Amazon EC2 and S3. The payback is typically less than a year, and the option to scale will open new doors to growth. It's only a matter of time before smart CFOs and CIOs in the enterprise market start noticing the benefits of saving 80% on bandwidth and starting up dozens of cloud servers with a single command (see the sketch below). The mainstream is starting to realize that the real risk is in staying with the status quo. Now that Amazon is offering a full SLA, I think we will see more mainstream adoption of these cloud computing platforms.
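
As for the "single command" claim, it really is one API call. Again, this is a boto3 sketch with placeholder values rather than anything from a real migration:

```python
# One call, two dozen servers (boto3 sketch; the AMI ID is a placeholder).
import boto3

ec2 = boto3.client("ec2")
ec2.run_instances(
    ImageId="ami-12345678",   # placeholder machine image
    InstanceType="m5.large",
    MinCount=24,              # launch all 24 or none at all
    MaxCount=24,
)
```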

If you want to hear more about how we build our system on Amazon, you can email me at tonymccune(at)gmail(dot)com or visit us at DigitalChalk.

Thursday, April 17, 2008

Creative destruction in action

Listening to the debate about cloud computing and its possible impact on jobs within technology companies has me thinking that perhaps the wrong question is being asked. It's not a question of IF cloud computing is going to take hold, but when. The real question is: what will you do with it?

This debate reminds me of the questions I used to get in the 2001-2004 time frame about Open Source in the enterprise. Many CIOs would ask me if I thought anyone was using open source software for actual enterprise systems. My typical response was to ask them about the commercial software they were running; almost invariably they would name a product that had significant open source components. The same thing is happening now with Web 2.0 applications being used in the enterprise. Many companies are jumping into Second Life to do group events or using Twitter to notify groups of non-critical news and events. I even heard recently about a municipal fire department that is considering using Twitter as an alternate method of notifying the community in the event of an emergency. And yes, you guessed it, both of these examples run on the Amazon Web Services cloud computing environment.

Where does this leave us? It's just another turn of the technology innovation and destruction cycle. As we push older technologies (read: operating systems) further down the stack, they will continue to become less mission-critical and less expensive. Geoffrey Moore described this process well in his 2005 book, Dealing with Darwin.

Can the cost of servers be driven to zero? Yes, absolutely, if a disruptive technology comes along that lets us run software on thin air. Until then, the cost of computing cycles will continue to drop at a steady rate. The next iteration has already arrived: within a few years we will see most applications in the cloud running in an on-demand application service. Google App Engine is the first into that game, but there are sure to be more coming quickly.

Wednesday, April 2, 2008

Community-generated content = quality?

I recently read a well-written post by John Warner in the Swamp Fox that discussed the value of Wikipedia and how some academics were refusing to let students reference it as a source. It occurs to me that there is a certain element (unfortunately including many teaching our youth) that values singular credentials over majority consensus.

Although I am firmly in the camp of consensus, it's worthwhile to examine both positions a bit more. Is the expertise of one or two highly specialized contributors worth more than the combined contribution of a large community? I think it's probably a little of both that brings quality. We don't gain value from a mob of uninformed opinions, nor do we get the most well-thought-out explanation from an expert who has never been challenged on the basis of his or her beliefs. The best outcome comes from a community of well-informed contributors.

Those of us who are willing to wade into the fray on a topic we are less than expert in provide a couple of benefits. First, we force the expert to communicate in a more complete fashion to address a broader audience. Second, we introduce fresh and unorthodox inquiries that often become the catalyst for new thinking on a topic.

Open Source software seems to be following this pattern. Over time, the community of contributors to any given project generates a much broader consensus about what the technology should look like and include. In the early days of my participation in Open Source, it seemed that the biggest benefit was volume: the sheer amount of code made Open Source attractive. Now I think the community (at least around larger, longer-established projects) brings its biggest value in the quality of mind-share it contains. It's fair to say that nowhere else in the history of technology has such a diversity of thought and objectives been focused on a single goal.

I think we live in some very interesting times. The time when a single entity could control the future of a technology has passed. The new currency in technology isn't code; it's the creative use of code. Solving a problem and building a base of loyal users is how we will make money in technology for the foreseeable future. So, taking care of customers and giving people what they need to be successful will make you successful? Interesting concept.
