The technology world is all aflame about “cloud computing”, and how businesses are supposed to move all of their stuff into the cloud, or die. Or something. In my eyes, “cloud” is simply a different name for the internet, and cloud computing is simply a different and fancier name for what most internet users have been doing for ages.
Even Wikipedia doesn’t really seem to have a clue as to what “cloud computing” actually is, and its definition is just a different way of describing the use of web applications – something we’ve been doing for a long time, with online mail clients, Facebook, and even things like BitTorrent and peer-to-peer networks. With all the talk of cloud computing, I can’t help but see the similarities between the cloud hype and the dot-com boom. In other words, the classic South Park underpants gnomes still apply.
Australian website Technology & Business got to interview Microsoft Australia’s Director of Developer and Platform Evangelism, Gianpaolo Carraro, about cloud computing and Microsoft’s upcoming Azure platform. I thought, here’s the chance for Microsoft to explain what, exactly, this whole cloud thing is, and what makes it different from all those things we’ve been doing for ages with and on the internet.
They failed.
The interview delves deep into what Azure actually is, and I can’t shake the feeling that there really isn’t anything new in there, nothing that makes me go “Ah, so that’s the cloud!”. From what I can understand, Microsoft will build a number of datacentres around the world where developers of cloud applications can rent server space to host their applications. Imagine, if you will, Facebook dismantling its own servers and renting server space from Microsoft. While this could be very handy for those of us with a great web app idea but without the space to run a server farm, it’s hardly revolutionary, nor does it explain what sets “cloud computing” apart.
And, well, that’s about all the interview makes clear about cloud computing. The rest has more to do with the technical aspects of Azure; for instance, while the Azure infrastructure is built around .Net, people are free to use Java or PHP as well. We also learned that it will take about 12 months before Azure emerges from beta.
I’ve read quite a few articles on this whole cloud computing thing already, but I simply don’t see how it is anything other than a new buzzword for something we’ve had for ages: applications that do not run on your desktop, but on a server somewhere. We’ve been doing that for a long time, but I guess the marketing value of the term “web application” ran out.
Carraro states that “there is no doubt that Microsoft is a big believer in the cloud”. That kind of makes sense. Microsoft completely missed the internet and web 2.0 train, and now sees an opportunity to grab a piece of that pie now that its name has changed.
Clever.
From what I’ve gathered about the whole cloud computing idea, though I can hardly say I’ve been searching for it, the difference with ‘traditional’ web 2.0 applications is mainly that they run on a cluster of some kind. A cloud, by most definitions I’ve seen, is a bunch of resources that can be used as if they were one. Running something in a cloud then means it gets placed on, and served by, some of the resources in that cloud – and it’s up to the cloud to decide which ones. This also gives scalability: instead of reserving resources up front that an application might someday need, such resources can now be allocated and deallocated dynamically. The nice thing about all this is that to the user of the cloud it appears to be one system, including possible abstractions such as the OS, filesystem, database, etc.
So, to summarize, we’ve got a distributed hosting service running on a number of machines, exposing one environment to the developer. That, really, is all I’ve come to make of it.
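To make that summary concrete, here is a toy Python sketch – every name in it is invented for illustration – of a pool of machines exposed as a single allocator. The point is only that the pool, not the user, decides where an application lands, and that capacity is handed out and reclaimed dynamically:

```python
# Toy model of a "cloud": a pool of machines exposed as one allocator.
# All names are hypothetical; this only illustrates dynamic (de)allocation.

class CloudPool:
    def __init__(self, machines):
        # free capacity per physical machine, e.g. {"node1": 8, "node2": 8}
        self.free = dict(machines)

    def allocate(self, app, units):
        """Place an app on whichever machine has room; the caller never picks one."""
        for node, capacity in self.free.items():
            if capacity >= units:
                self.free[node] -= units
                return node  # the pool, not the user, decided where it runs
        raise RuntimeError("pool exhausted: time to buy more servers")

    def release(self, node, units):
        self.free[node] += units  # capacity goes back into the shared pool


pool = CloudPool({"node1": 8, "node2": 8})
where = pool.allocate("my-web-app", units=3)   # scales up on demand
pool.release(where, units=3)                   # and back down again
```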
The non-retarded version of ‘cloud computing’ is the idea of a standard platform across hosting providers that provides a bunch of useful libraries and stuff.
This platform is supposed to offer transparent scaling, i.e. the hosting provider automatically moves your application onto more servers as needed.
The standard platform means that you can move your app between hosting providers without hassle, or even run your app across multiple hosting providers at the same time for redundancy.
The problem is that everyone got retard strong and marketing happy, and now there isn’t a standard platform; there is just restrictive lock-in hosting on a ‘cloud’ (read: data centre) owned by a single company. It’s insane to write an app that only runs on a single hosting provider.
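A minimal sketch of what such a standard platform could look like, with every name here purely hypothetical: the app codes against one interface, and each hosting provider supplies its own implementation, so transparent scaling and provider moves never touch application code.

```python
# Hypothetical sketch: a standard platform interface that any hosting
# provider could implement, so apps stay portable between providers.
from abc import ABC, abstractmethod

class CloudPlatform(ABC):
    @abstractmethod
    def deploy(self, app_bundle: bytes) -> str: ...
    @abstractmethod
    def scale(self, app_id: str, instances: int) -> None: ...
    @abstractmethod
    def store(self, key: str, value: bytes) -> None: ...

class ProviderA(CloudPlatform):
    """Stand-in for one provider's implementation of the standard."""
    def deploy(self, app_bundle): return "app-123"
    def scale(self, app_id, instances): print(f"{app_id} -> {instances} instances")
    def store(self, key, value): print(f"stored {key}")

# The app talks only to the interface, never to ProviderA directly,
# so swapping providers (or running on two at once) is trivial.
platform: CloudPlatform = ProviderA()
app_id = platform.deploy(b"...bundle...")
platform.scale(app_id, instances=4)
```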
So, what do you mean by non-retarded? Normal?
Geez, could you be any more offensive? Using the word retarded as slang is just offensive and demeaning. Why deliberately hurt people with your cruel words?
It’s supposed to be demeaning. When people come up with terrible, useless ideas and market them widely, they must be ridiculed or they’ll keep doing it.
‘Retarded’ refers to something that is held back or slowed; in this context it refers to someone who is “slow or limited in intellectual development”.
The premise of the statement is that someone who wasn’t limited in intellectual development came up with a useful idea, and a number of people with ‘limited intellectual development’ warped the idea into something useless and then marketed it.
Unless you are someone who believes in ‘cloud computing’ in its widely marketed sense, you have no reason to be offended. If you don’t, then you have likely created the offense yourself, by generalising the remark to unrelated people with ‘limited intellectual development’.
– Jesse
retarded – 1. underdeveloped, 2. mentally challenged
Could you be any more PC?
The reason it doesn’t seem any different from what we’ve been doing for ages is, well, that it is no different. I’ve seen it as nothing but a buzzword from the beginning, along with Web 2.0 (yet another buzzword). Those marketing types have to justify their jobs somehow; I guess this is one of the ways they do it.
Like every other technology on the web, cloud computing is built on top of existing technologies. It’s the logical extension of virtualization. What is cloud computing? It’s the abstraction of computing resources. It’s divorcing the physical hardware from the running of the software.
Here’s an example use case. You have a corporate IT group. When you need to run a new application you request one or more servers from the group. You define the business case for your servers, specify memory and disk requirements and so forth. You wait six months for budget approval, purchasing, installation and validation. You then receive access to your servers.
Now suppose your corporate IT group has converted to an internal cloud computing infrastructure. They have purchased large numbers of identical servers. They have a SAN with a large amount of storage. You need a new server. You define the business case, specify some parameters (you want a firewall, etc.) and submit the request. You wait 15 minutes while the IT group uses the cloud console to allocate new instances. In their next planning meeting, the IT group determines that the instance growth rate will require new server purchases at the next purchase cycle.
Here’s another one. You’re a QA manager at a small Internet company. Your new application has reached the point where it requires load testing. You don’t have the physical machines it will take to slam the application with load. You set up an account on the Amazon cloud and spool up hundreds of load-testing instances. You run the load test and shut down the instances. You write a small check.
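A rough sketch of that QA scenario with the modern AWS SDK for Python (boto3); the AMI ID, instance type, and counts here are placeholders, not a recommendation:

```python
# Sketch of the load-test scenario on EC2, using boto3.
# The AMI ID, instance type, region, and counts are all placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Spin up a fleet of load-generator instances...
fleet = ec2.run_instances(
    ImageId="ami-00000000",       # hypothetical load-generator image
    InstanceType="t3.micro",
    MinCount=100, MaxCount=100,
)
ids = [i["InstanceId"] for i in fleet["Instances"]]

# ...run the load test against the application here...

# ...then shut everything down, so the bill stops the moment you're done.
ec2.terminate_instances(InstanceIds=ids)
```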
The only “new” thing in cloud computing is pushing the virtualization abstraction out further than it was before. You lose sight of the physical hardware completely. The cloud administrators see it; you, as a computing consumer, don’t. As far as you’re concerned, the computing and storage resources are infinite. In practice you get more efficient use of resources and can support more servers than with raw virtualization.
First right answer (and +1).
To build on that: outside of EC2, cloud computing means plugging into a proprietary API, with varying levels of invasiveness. Google and Microsoft are the worst, where your persistence story ends up completely tied to the way they do things (which is not even close to the way you would normally do things). Other solutions like Heroku (for Rails) will magically just work; however, you inevitably give up something else (in the case of Heroku, you are stuck with a read-only filesystem).
The other thing about cloud computing is that once you are designing your app for one of these pay-for-service platforms, it may cost you a lot to get off of it. In the case of Heroku or EC2, this isn’t a big deal. But App Engine and Azure will require figuring out how to dump all the data, translating it into a new system (most likely a database, which is completely different), and then rewriting your persistence strategy (which is typically a significant piece of any application). So once you choose to go with one of these platforms, if they change their rates, or just decide not to offer the level of service you require, they have you by the short hairs.
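One common hedge against that, sketched below with purely hypothetical names: keep persistence behind an interface you own, so leaving a platform means rewriting one adapter instead of the whole persistence strategy.

```python
# Hypothetical sketch: isolate platform-specific persistence behind one
# interface, so a provider switch means rewriting an adapter, not the app.

class Store:
    def get(self, key): raise NotImplementedError
    def put(self, key, value): raise NotImplementedError

class AppEngineStore(Store):
    """Adapter for a proprietary datastore API (calls elided)."""
    def get(self, key): ...
    def put(self, key, value): ...

class SqlStore(Store):
    """Adapter for a plain relational database (calls elided)."""
    def get(self, key): ...
    def put(self, key, value): ...

def save_profile(store: Store, user_id: str, profile: dict) -> None:
    # Application code depends only on Store; migrating platforms
    # swaps the adapter and never touches this function.
    store.put(f"profile:{user_id}", repr(profile))
```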
Even though it is a big buzzword right now, it makes no sense for anyone but a startup to use most of these platforms (again, Amazon EC2 is the exception, with its VPS approach), and even if you are a startup, it requires serious consideration of the downsides before you move forward. If you are a big company it is a downright dumb idea, except for cases like using EC2 for disaster recovery, or for the need to scale immediately (or temporarily).
So “cluster” wasn’t sexy enough; we had to rename it “cloud” to indicate that we were doing the exact same thing yet again? Oh no, we’re not hosting that program on the AS/400 – it’s in the company “cloud”… which is hosted on the AS/400… it’s totally different.
I did like the clarity of your examples, though, and the fact that you focused on computing within the organization. One of my personal problems with the “cloud” hype is the insecurity and loss of control that come from moving intimate data beyond the limits of the company network. Centralized resources make sense within the confines of the company. Involving an untrusted third party in providing company storage and software is madness.
OMG! A new buzzword! Yays!
I am so sick of analysts. Has anyone else noticed there isn’t much difference between them and an anal-cyst?
Both can cause long-term disruptions.
Rofl. I’m sure someone will mod that down as irrelevant or some other childish thing, but it won’t be me. That’s the best laugh I’ve had so far this week.
Heh… well, we (briefly) had a mail filter on our work server that would have blocked both of them.
http://www.penny-arcade.com/comic/2009/3/25/
I fell off my chair when that one popped up last week. I’d like to think that the novelty of the buzzword is finally wearing off, given the discussion here so far. Sadly, among non-techies and the mass media, bubble 2.0… er, web 2.0… er… cloud… er… centralized server/thin-client computing is still seen as some kind of new and magical use of internationally networked computers.
Anyhow, I think this was a great article. Maybe with enough of them pointing out the blatant fallacy of the cloud and version-numbering hype, it will go away. Another outcome would be watching all those web apps evolve to finally solve the things that make cloud computing nothing more than marketing: true security, network fault tolerance, hosting-company data access…
My favorite registered domain name back in the dotbomb era, and no doubt owned by some hopeful entrepreneur looking forward to getting rich, was “buyfreshclamsonline.com”. Perhaps I’d do well to domain-squat at “buyfreshvirtualclamsonline.com” this go-round for an easy, low risk profit.
Cloud computing is more about transitioning the basic infrastructure of software from a cottage industry to a ubiquitous service.
Very few businesses (except those with special requirements) have their own electricity-generation facilities – except as an emergency backup. Everyone relies on the grid.
In 20 years’ time, no one will have their own server infrastructure (except as a backup for critical systems); everyone will rely on the cloud.
That’s what cloud computing is all about. It requires technology innovation at the hardware, fabric, and software layers. And if it all goes well, *no one* will remember what this “cloud computing” thing was all about, but we’ll all be using it.
Great. And, if that happens, our privacy is now nonexistent as opposed to being almost nonexistent, and the right to access our data on our own terms, and with our own choice of program, goes straight down the toilet. No thanks, I’ll be keeping my personal computers, even if the whole world goes to web apps.
Once “Cloud Computing” becomes the norm, all personal computers, of every sort, will quickly become obsolete. Off-the-shelf software will dry up. Your OS will become unusable, un-updatable, unprotected.
Look at Windows 98SE. I swore I would never get Windows XP when it came out, because of Microsoft’s “activation”. Several years later, I try installing Windows 98SE on something and I can’t find drivers… can’t find apps… can’t find or update antivirus/antispyware tools, etc. And even web browsers are becoming hard to find.
The same will probably happen to XP fairly soon, and then to Vista. No matter how much you would like to fight “the system”, you WILL be forced to go along, whether you like it or not. I thought I could escape “cloud computing” by being on Ubuntu… I can’t escape it there, either!
Privacy, Individuality, and True Christianity will soon become obsolete and be considered “dangerous”. Everything in this world is leading people to become part of “the system”: cloud computing, RFID, contactless cards, etc. Enjoy what you have now, while you have it… for eventually it will be gone.
Cloud Computing is like the seeker in hide and seek… it’s done counting and is now calling out… “Ready or not, here I come!”
Perhaps the commercial OSes will die out, but I doubt the open source ones will. As long as there are those who are willing and able to continue working on Linux, the BSDs, Haiku, etc., they will never die. The world isn’t limited to Windows, nor will it be limited to the cloud, if indeed this ever happens. I don’t picture it happening, for the simple reason that people want to be able to work on some things without requiring an internet connection. There are quite a few parts of the world where such a connection still costs huge amounts of money, and when all someone really wants is to work on and print a document, for example, requiring one would be pointless. And I won’t even comment on the Christianity bit, as my views (totally opposed to Christianity in all forms) would start a nasty flame war, and we’ve already had one of those on that very subject about a week ago.
Google_ninja raised the issue of vendor lock-in, and yet I find it interesting that people seem quite happy to vendor-lock themselves into an operating system by relying heavily on proprietary technology, while taking a contradictory approach when it comes to cloud computing: somehow vendor lock-in is OK for the desktop, but not for the cloud.
What makes desktop/server/traditional computing so absolutely different from cloud computing that someone can hold contradictory views on the matter? If open standards are important in one area, then shouldn’t they also be important in the other? If not, why not?