Cloud Computing versus Hosts
Hi…
I found an article (actually I have seen lots of those in the past few weeks) which compares hosts to what cloud computing is about today. Mostly the conclusion is "see, we already had all that and we should never have left host computing in the first place".
Well, this is true and wrong at the same time.
Certainly some of the basic principles used in cloud computing today were also used in host computing. I will even admit they might have been designed for hosts in the first place. But let us step back for a second.
The Software Crisis
Today we still live in the time of the "Software Crisis". First, what does this mean?
If we see the whole thing as a market, we have human society (including the economy) defining a demand for solutions. On the other side we have the IT industry trying to fulfill that demand. I write "trying" here because the industry is not able to saturate the market, and this has been true ever since the birth of the industry. The amount of computing power that human society asks for is overwhelmingly more than the industry can produce.
So what does such a solution look like? Normally the whole process starts with a problem that might be solved, or at least weakened, by applying information technology. We call the result the solution, accepting that the problem still might not be solved to its full extent. The solution to a problem is a function of software and hardware.
On the hardware side we all know Moore's law. While we still have problems that exceed hardware capabilities, we are today in a rather luxurious position regarding hardware. Hardware is highly componentized and standardized. Most of the time it is no longer the limiting factor.
The crisis sits on the other side: within the software. While we do have components and standards here too, software is still behind the hardware in this respect (I never saw anything like the CSS standard, quality-wise, in hardware land, anyway…).
The crisis is so bad that we are not even able to "produce" enough skilled staff to work on the problems.
So how can the system react to this pressure? Certainly by growing the number of people working in the field, but since this alone cannot solve the problem, another way is to lower the entry barrier and streamline the process of software production.
If you now take a short look at how we produce software, you will see a fundamental tendency to reduce complexity. Take scripting languages as an example. They reduce complexity even at the price of not being optimal in terms of memory or processor consumption. We shift the burden of the crisis from the software side to the hardware side, which is able to scale better.
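To make that trade-off concrete, here is a minimal sketch of my own (the file name "input.txt" is just a placeholder): a few lines of Python count word frequencies in a file, something that would take noticeably more code in C, at the price of loading everything into memory and letting the runtime do the heavy lifting.

# word_freq.py - illustrative only: a quick script that trades
# machine efficiency for developer time.
from collections import Counter

def word_frequencies(path):
    # Reads the whole file into memory at once - wasteful for huge
    # files, but trivial to write and "good enough" for most cases.
    with open(path, encoding="utf-8") as f:
        words = f.read().lower().split()
    return Counter(words)

if __name__ == "__main__":
    for word, count in word_frequencies("input.txt").most_common(10):
        print(word, count)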
Not every problem is a NASA Mars mission
Here one can ask for a new measure called software quality. While experts still discuss what it really stands for, it is not necessary to define it in all its consequences for this idea. We all have a basic notion of what software quality means: availability, memory consumption, user experience, etc. In short: the software should always behave as expected.
We all understand that quality costs effort, which translates directly into developer time. Therefore high-quality software is costly. So whenever a solution is thought through, it is right to ask whether the return on the investment is at the desired level. And there are some parameters one can tweak to get it onto the right path: less software development versus more hardware usage; reaching a high-volume market to divide development costs among more customers; less testing, accepting the product "as it is", versus NASA-Mars-mission-grade testing. In the end it is a quality decision to make.
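The "divide development costs among more customers" lever is simple arithmetic. A toy calculation with entirely hypothetical numbers (none of these figures come from a real product):

# Illustrative only - hypothetical cost and customer counts.
dev_cost = 10_000_000  # assumed total development cost in dollars
for customers in (1_000, 1_000_000, 1_000_000_000):
    print(f"{customers:>13} customers -> {dev_cost / customers:.2f} dollars each")

The same fixed investment shrinks from thousands of dollars per customer to a fraction of a cent, which is why a mass-market product can afford a level of quality a niche solution cannot.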
Take a product like Windows, for example. It is a tremendously complex product (I once heard that it is more complex than the entire moon-flight program, because it is an open system that must work with existing and yet-to-be-designed software alike). But the effort is worth it, because the burden is divided among the billions who use it in the end. This is the NASA-Mars-mission type of software: quality rules!
Now let's take my family website. I hammered it together on a rainy Saturday, it is not really tested at all, and I am willing to reboot the whole thing from time to time. The interest in raising its quality is limited… at least from my point of view.
Back to the Cloud
Since the industry has been struggling to overcome this basic crisis all along, host computing was a valuable step in the process. And as Dirk Primbs always says, "We can introduce new technologies, but once introduced, they never go away!" Ideas and algorithms developed for host computing are still part of the fundamentals, but in the end the host did not solve the problem.
The most remembered characteristic of host computing is also its killer: the enormous software quality. That high level was bought with a high up-front investment… but not all software is a NASA Mars mission.
Cloud computing takes this to a new level. It democratizes IT in the sense that it not only offers "unlimited" hardware, it also lowers the entry barrier to developing software. Quality of the whole solution is a "can", not a must. The system, and even the users, are more fault tolerant, and scripting away is acceptable. If the idea really ignites, we will see something like a Cambrian explosion in software development taking place.
Today's startups do the right thing when they develop the solution a bit sloppily at first. If it does not fly, well, at least the investment was not too heavy. If it flies, increase quality from release to release… or simply go dark for a week like Twitter did ;-)
The Cambrian explosion was certainly an interesting time to live in. And actually we are still living on the credit of that age…
CU
0xff