To Cloud Or Not To Cloud

The Cloud

It's the new buzzword.

In my last post I discussed Bleeding Edge versus Cutting Edge technologies. When a new technology suddenly becomes all the rage, you are seeing it start to push from the bleeding edge toward the cutting edge. It is no different here.

People have been talking about "the cloud" for some time now. Recently both Apple and Microsoft jumped on board with clouds of their own. Amazon has been at it for a long time now.

But just what is this "cloud" anyway?

According to Wikipedia, Cloud Computing is "using multiple server computers via a digital network, as though they were one computer" (See Details Here).

But the definition has really been warped. Now when people talk about "being in the cloud" they usually mean that they are using, at least in part, applications that reside on other computers on the internet. Simply stated, "the cloud" is marketing speak for "on the internet."

So what is the big deal anyway?

While it may sound as if marketing is just trying to pull the wool over your eyes, there really is something different about the use of the internet that people are trying to communicate when they say "the cloud". What they are emphasizing is not some newfangled technology, though new technology is making it easier and better, but a new way of using the internet.

Back in the early days of computing, users had to sit down at dumb terminals that plugged into a very large, centrally located computer. Computers were too big to fit on a desk or even in someone's office. They were also too expensive to purchase for each individual.

This centrally located computer was easier to manage, allowing staff to support and upgrade everyone by maintaining a single machine. It also helped distribute large expenses across many users, increasing the value per dollar. The problem was that you had to physically connect your terminal to the mainframe. This usually meant computer users had to go down to a computer center, which was less than convenient. It also meant computers were not easily available to the general public and were usually restricted to large companies and educational institutions.

Once the personal computer became small, easy and affordable, there was a shift away from mainframe computers. Now people were free to have a computer in their home or office. They could install whatever software they wanted. And they were not restricted by whatever policies their company or university imposed.

When the internet came into wide use, computers began to become interconnected again. At first, the internet was a large collection of individual computers communicating seemingly at random, as needed.

The new problem was that people had to take on the responsibility of setting up and maintaining their own computers. From installing software and troubleshooting problems to dealing with viruses and hackers, individual users were required to become somewhat technical. And you had to install and continually upgrade your own software, which wasn't exactly cheap.

Over the last several years we have seen a new shift. This is the shift to "the cloud" everyone is talking about. In essence, we are beginning to go back to the old mainframe model. But this time it is better.

How is it better? First of all, with the wide reach of the internet, vast connectivity and wireless technologies, you can get on "the cloud" from just about anywhere. Secondly, you can do it from your personal device. This gives you the freedom and control that were never part of the mainframe model.

The exciting advantage of "the cloud" over the individual personal computer is the time and cost savings to the individual. As internet speeds improve, dead zones disappear and more software is made available as an online service, our personal computers and devices become simpler and cheaper. More of the raw processing power is moved to servers. More of the security and protection from hackers and malware is placed on the service provider. And less power is required of your device (which translates to cheaper, smaller, lighter hardware and less power usage).

But while we keep hearing about the greatness of the cloud, it is still a Bleeding Edge technology. Some services work great, like e-commerce web sites, online news, video services and online backup. But most of the best technologies out there are either really simple user interfaces or services that still require software to be downloaded and installed on your own computer.

To be a pure cloud application (pure being a somewhat subjective term here), you would want the whole application to run on a server while the only thing that runs on your computer is a simple user interface that interacts with that server and lets the server do all the work. We see this with web apps (Google Docs and web stores), MMORPGs (World of Warcraft and EverQuest) and Flash-based applications (Gliffy).
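As a rough illustration of that thin-client idea, here is a minimal sketch in TypeScript. The endpoint name ("/api/documents") and the response shape are purely hypothetical; the point is only that the browser code asks the server for results and displays them, while all the real work happens server-side.

```typescript
// Minimal thin-client sketch: storage, search and business logic are
// assumed to live on the server. The browser only fetches and displays.
// NOTE: the "/api/documents" endpoint and the Doc shape are hypothetical.

interface Doc {
  id: string;
  title: string;
}

async function loadDocuments(): Promise<Doc[]> {
  // The server does all the heavy lifting; we just fetch the result.
  const response = await fetch("/api/documents");
  if (!response.ok) {
    throw new Error(`Server returned ${response.status}`);
  }
  return (await response.json()) as Doc[];
}

async function renderDocumentList(): Promise<void> {
  const docs = await loadDocuments();
  const list = document.createElement("ul");
  for (const doc of docs) {
    const item = document.createElement("li");
    item.textContent = doc.title; // display only; no business logic here
    list.appendChild(item);
  }
  document.body.appendChild(list);
}

renderDocumentList().catch(console.error);
```

Nothing in that snippet knows how documents are stored or processed; replace the server and the client never changes, which is exactly the appeal of the pure cloud model.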

The thing we hear probably a little too much of now is how clients want to be a part of "the cloud." To an extent I do agree. But we must be careful here. There is a reason most applications we use today are either not cloud based or only partially cloud based and still require software to be installed and run on your own computer. Understanding the limitations of "the cloud" is important in making sure you stay on the cutting edge and don't slip over into the bleeding edge.

The first and probably most important reason most applications are not purely cloud based is speed. Google is desperately trying to expand new, faster internet connections throughout the US. But this takes time. For now, many of the purely cloud based applications are painfully slow because the internet is slow. Yes, it is faster than it used to be and it is getting faster. But it is also much more heavily taxed by increasing demand.

The second is that the technologies that help deliver pure cloud based services are not quite mature. Delivering an application to any device requires that application developers write their software three or four times. This means instead of writing one application, we have to write three or four applications, multiplying the cost by three or four times. Additionally, the software frameworks for the different platforms are somewhat limited. On top of it all, the standards for the frameworks are slow to be defined and then adopted (this is especially true with web applications - what works on one browser doesn't work on another, or what works on one operating system doesn't work the same on another).
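To make that browser-inconsistency point concrete, here is a small sketch of the kind of feature detection web developers end up writing. The vendor-prefixed names below were real at one point, but treat the snippet as illustrative rather than a complete compatibility layer.

```typescript
// Sketch of feature detection for requestAnimationFrame, an API that
// browsers shipped under different vendor-prefixed names before the
// standard settled. The same app needs a different lookup per browser.

type FrameCallback = (time: number) => void;

function getRequestAnimationFrame(): (cb: FrameCallback) => number {
  const w = window as any; // prefixed names are not in the standard typings

  const impl =
    w.requestAnimationFrame ||       // standard name
    w.webkitRequestAnimationFrame || // older Chrome / Safari
    w.mozRequestAnimationFrame ||    // older Firefox
    w.msRequestAnimationFrame;       // older Internet Explorer

  if (impl) {
    return impl.bind(window);
  }

  // Fall back to a plain timer (roughly 60 frames per second) when no
  // implementation exists at all.
  return (cb: FrameCallback) =>
    window.setTimeout(() => cb(Date.now()), 1000 / 60);
}

// Usage: the same drawing loop now runs regardless of which browser
// (or which prefix) the user happens to have.
const raf = getRequestAnimationFrame();
raf((time) => console.log("first frame at", time));
```

Multiply that little dance across every API an application touches, and across desktop, web and mobile targets, and you can see why "write it once for the cloud" is not yet as simple as it sounds.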

Another change that is occurring, but is still a bit slow, is the acceptance of this new model by consumers. Some of us still like to work "off the grid." Others want to "own" their software and are not comfortable with the "leasing" model most cloud services provide. There are privacy and security concerns as well.

There is no doubt, "the cloud" is slowly moving from Bleeding Edge to the more mainstream Cutting Edge. It is an important part of our future in computer and device usage. And it promises to provide a much richer and more efficient experience for users. But in many regards it is still Bleeding Edge. So if you want to cloud enable your application, be sure to look at the options. You don't have to go pure cloud; quite often a mix of approaches can provide the best of both worlds, merging convenience, low cost and ease of use.