I attended VMworld last week, and as you might imagine, it was “cloud computing” this and “cloud computing” that the whole time. The hype factor for the cloud is in overdrive right now. But is it warranted? A lot of people, even tech-oriented ones outside of the data center sysadmin crowd, wonder what all the hype is about. I’ve come to believe that cloud computing is a major computing revolution, but for most computing users, it’s an invisible one.
Geek ambivalence about cloud computing is interesting, because it’s not like it’s a new phenomenon. In tech years, the idea has been around for ages. But part of the problem is that the actual definition of cloud computing isn’t really all that easy to pin down. And marketers have been fond of labeling things “cloud computing” when the connection is only peripheral, as a shameless ploy to capitalize on a hot trend. But in a nutshell, here’s what I think is the essence of the cloud computing concept.
There are two technological innovations that, available together, make cloud computing possible: ubiquitous internet access and advanced virtualization technology. With virtualization, a “server” doesn’t have to be a physical machine. In the olden days, if you wanted a server, you had to procure a physical machine, or access to one. If you thought your needs would scale up, you would get a more powerful machine than you currently needed, just in case. And once you came close to outgrowing that machine, you would need to either get a new machine and migrate your system over, or scale out to more machines by spinning components, such as the database, off onto their own servers or load balancing between two. System administrators were constantly trying to strike a balance between paying for capacity that would never be used and dealing with problems or outages caused by usage spikes when they didn’t scale out quickly enough. And scaling out was sometimes very hard. Moving a running, mission-critical system from an old server to a new, faster one was no picnic.
Virtualization made it possible to decouple the “server” from the server hardware. So if you needed more capacity (processor cycles, memory, or storage) you could scale out your server to new hardware, even if that meant moving to a new data center in a different part of the world, without all the fuss. And the ubiquitous network made it easier for the people who used these services to access them even if IT managers started to aggregate them into centralized “cloud” data centers. So this meant that a small startup could order a server from Amazon and never have to worry that they didn’t order one that would be powerful enough if they hit the big time. A researcher would be able to build a system to crunch some numbers for three weeks, then just delete it when the calculation was done. And a large company’s IT managers could start to decommission the various server boxes that were spread out in closets in offices around the country, and instead provision instances from their centrally-managed “private cloud”.
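To make that “spin it up, crunch the numbers, throw it away” workflow concrete, here is a minimal sketch using Amazon’s Python SDK (boto3) as one example; the image ID and instance type are hypothetical placeholders, and any provider’s API would look broadly similar.

    import boto3

    # Connect to the EC2 service in one region.
    ec2 = boto3.resource("ec2", region_name="us-east-1")

    # Launch a single on-demand instance from a machine image.
    # The AMI ID below is a made-up placeholder, not a real image.
    instances = ec2.create_instances(
        ImageId="ami-12345678",
        InstanceType="t3.medium",
        MinCount=1,
        MaxCount=1,
    )
    instance = instances[0]
    instance.wait_until_running()
    print("crunching numbers on", instance.id)

    # ...three weeks later, when the calculation is done,
    # throw the whole "server" away and stop paying for it.
    instance.terminate()
    instance.wait_until_terminated()

No hardware gets ordered, racked, or decommissioned anywhere in that process, which is exactly the point.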
I think the reason that so many geeks don’t really understand what the big deal is over cloud computing is that unless you’re running a big data center, you’re not really the one who’s reaping the direct benefit of cloud computing. I blame the marketing, to some extent. We hear about various cool web services, like Evernote or Dropbox, or even “operating systems” that depend on “the cloud”, such as ChromeOS or eyeOS. (By the way, I use Evernote and Dropbox.) But from the point of view of the end user, cloud computing is just a fancy word for web hosting. If I’m using Dropbox, I don’t really care if the storage is in the cloud or on a big old-fashioned storage server somewhere. As long as the service is reliable, it doesn’t matter to me. But it sure matters to the poor sap who has to maintain the Dropbox servers. Cloud computing makes it much easier for the company to grow as it needs to, even change hosting providers if necessary, without disrupting my service and without wasting money on unused capacity “just in case.”
I guess the other big recipient of the value of cloud computing is the accountant (which would be another reason why the geeks wouldn’t really get it, unless you’re an accounting geek). Another buzzword that’s commonly associated with cloud computing is “utility computing,” which basically means that you pay for computing resources as a metered service, just as you would for electricity. For the CFOs of the world, it means that you don’t spend a bunch of money on hardware that you may or may not be extracting full value from. I think it’s safe to say that most large companies only end up using a small percentage of the computing resources they have sitting in the racks and on the desks in their buildings, so from an efficiency standpoint, it’s better to pay for what you use, even if you’re theoretically paying a higher rate for each unit of processing power. The old way wastes time, money, and electricity.
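To put some rough numbers behind that, here’s a back-of-the-envelope sketch. Every figure in it is a hypothetical placeholder rather than a real price, and it ignores power, cooling, and admin costs, but it shows why metered capacity can come out ahead even at a higher sticker rate.

    # Owned hardware: you pay for the whole box whether you use it or not.
    server_price = 5000.00         # purchase price of a physical server (hypothetical)
    lifetime_hours = 3 * 365 * 24  # three-year depreciation period
    utilization = 0.15             # fraction of capacity actually used

    owned_cost_per_hour = server_price / lifetime_hours           # ~$0.19 per hour for the box
    owned_cost_per_used_hour = owned_cost_per_hour / utilization  # ~$1.27 per hour of *useful* work

    # Metered cloud instance: a higher sticker rate, but only for the hours you use.
    cloud_rate_per_hour = 0.40     # hypothetical on-demand rate for a comparable instance

    print(f"owned, per used hour:   ${owned_cost_per_used_hour:.2f}")
    print(f"metered, per used hour: ${cloud_rate_per_hour:.2f}")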
So this is OSNews, and we primarily concern ourselves with operating systems here. Where do OSes fit into this new world? Well, virtual servers are still servers, and each and every one still needs an OS. What we’ve done is insert a “sub-OS” called a (Type 1) hypervisor under the regular OS, and that hypervisor allows one or more physical servers to run one or more OSes or OS instances. You could have one OS instance spread across multiple physical machines, or hundreds of OS instances on one machine. A Type 2 hypervisor allows a guest OS to run inside a host OS, which is also useful, but serves a very different purpose. Depending on the platform, a VM can be moved from one type of hypervisor to another, so you might configure a new server in a VM running as a guest on your laptop, then transfer it to run on a “bare metal” hypervisor in a cloud hosting environment when you launch to production.
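For a feel of what “talking to a hypervisor” looks like in practice, here’s a minimal sketch using the libvirt Python bindings, which can drive several hypervisors (QEMU/KVM among others) through one API; the connection URI and the assumption that libvirt is installed are mine, not anything specific to VMware’s or HP’s products.

    import libvirt  # Python bindings for the libvirt virtualization API

    # Connect to a locally running hypervisor (QEMU/KVM in this sketch;
    # the URI would differ for other hypervisors).
    conn = libvirt.open("qemu:///system")

    # List the guest OS instances this one host is currently managing.
    for dom in conn.listAllDomains():
        state, _reason = dom.state()
        status = "running" if state == libvirt.VIR_DOMAIN_RUNNING else "not running"
        print(dom.name(), status)

    conn.close()

Each of those guests is a full OS in its own right, while the hypervisor underneath treats them as files and processes that can be started, stopped, or moved to other hardware.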
One aspect of the OS world that’s made more complicated by cloud computing is licensing, and Microsoft in particular is in a kind of difficult position. One of the advantages of cloud computing is that you can turn server instances on and off willy-nilly, as you need them. You can copy an entire instance from one datacenter to another, or clone one and make a derivative. But when you have to deal with whether the OS and software on that VM you’re moving around is properly licensed, it adds a whole level of unwelcome complexity to the mix. That’s one reason why Linux, MySQL, and other open source software have been very popular in cloud environments.
If you buy cloud hosting from Amazon, they just build the Windows license into the fee if you order a Windows instance. But if you’re using a lot of capacity at Amazon, you end up getting kind of a bad deal, and you’d be better off buying your Windows server licenses at retail and installing them yourself on Amazon’s VM.
And virtualization technology is getting bundled with operating systems more and more. Microsoft has its own hypervisor, which is included with Windows Server 2008. It’s just one of the commercial virtualization platforms that are available today.
Another reason why cloud computing is an invisible revolution is that a lot of what’s happening lately is in the arena of the “private cloud”. OSNews’ trip to VMworld was sponsored by HP, which is putting a huge amount of effort into helping its enterprise customers replace their current infrastructure, which could easily be described as “servers here, servers there, servers we don’t know about, we can’t keep track of them all, and we certainly can’t properly maintain them all”. And one of the reasons why the server situation is so chaotic at big companies is that when someone needs a new server for something and contacts IT, they get the runaround, or they’re told they’ll have to wait six months. So a lot of the innovation recently is around helping big companies set up a centralized data center where the whole thing is a private cloud, and when someone in some branch office needs a new server, one can be provisioned with a few keystrokes.
The people ordering the new servers don’t even need to know it’s a cloud. They don’t care. All they know is that suddenly their IT people are getting things done a lot quicker. So again, to the outsider, it just looks like regular old hosting, or regular old IT provisioning.
So what about the so-called cloud OS? Where does that fit in? I’m afraid a lot of that is marketing hype, because for the user of a cloud OS, it doesn’t really matter whether the apps and storage they’re accessing over the network are hosted in a cloud or on a regular old server. But the reason that it’s meaningful is that it would be impractical for any company to offer a server-based desktop user experience without cloud computing backing it up on the server side. It would just be too difficult to deal with the elasticity of demand from the users without the flexibility that comes from virtualization.
I think the reason for the marketing hype is that people are inherently wary about their “computer” not really existing inside the actual case that’s on their lap or under their desk. Both novice and advanced computer users are nervous, though for different reasons. For some reason, the idea that their computer exists “in the cloud” is just inherently less scary than “it’s on our server, in a rack, in our datacenter, in California,” though in reality there’s barely any distinction. And until “the cloud” becomes the only way that anyone hosts anything, just as movie studios at some point stopped advertising that their films were “in color!”, I think marketers will keep making the distinction.
But don’t let the hype and misdirection distract you from the real issue. We’re in the midst of a huge revolution. And one of the reasons that a lot of people fail to appreciate the big deal is precisely why it’s a big deal: for the end user, the move to cloud computing is supposed to be transparent and painless. Even the programmers, power users, and other geeks using these systems are just working the way they’ve always worked, and that’s the whole point.
Heh, I’m pretty sure the datacenter sysadmins are wondering what it is all about too… usually they are at least slightly paranoid and probably will never trust the cloud with any significant data.
Hence the focus on private clouds. The benefits of the cloud without entrusting your infrastructure to a 3rd party.
You didn’t read the article, did you?
To me the worst aspect of the “cloud” as it is sold now is that you are wasting perfectly powerful computers with plenty of storage by using them as thin clients of overloaded servers. I guess storage makes more sense, but only for backups.
Let’s not forget that the “cloud” was standard practice once, and it lost to personal computers.
Wake me up when they get more aggressive and, say, start selling company-controlled “cheap” computers that would load a synced environment for that individual user but at the same time use “his” computer as a backup server for other people’s data. They only need P2P technology, DRM, buzzwords, and a shitload of cash.
Umm, that sounds a lot like Google Chrome, only more evil.
The only app I’ve seen that really takes advantage of the cloud is Panda’s Cloud AV. It makes AV scanning a group effort. It’s like FOSS development for virus scanning, and the concept is really cool from a security standpoint.
http://www.cloudantivirus.com/en/
Zonbu is already kind of doing that. You can buy a thin client or a pre-configured system from them, or use your own system to access a virtual desktop.
http://www.zonbu.com/whatiszonbu/
There are many legal issues, user issues, and licensing issues which prevent cloud computing from living up to your expectations. For instance, botnets could be considered a type of cloud computing, and we are all aware of how those are used.
For argument’s sake, let’s call the opposite of “cloud computing” “ground computing”. “Ground computing” then means running an OS installed on the local machine, having it process data stored on that same machine, and then storing the processed data back to the same machine; everything is basically done locally.
“Cloud computing”, for the most part, means that the OS itself, the data to be processed, the processed data, or the actual crunching of the data lives somewhere else on the internet, on someone else’s hardware.
Cloud computing is a bit problematic for free software advocates. You can’t really say you are king of your own data if it sits on somebody else’s hard drive somewhere on the net. How is that different from having the same data locally but in a proprietary format?
Data format and data location are different enough. The issue is trust and accessibility. Trust [towards the provider] that it stores your data losslessly (no data loss) and securely (nobody else can access your data). Trust that it doesn’t go bankrupt and disappear. Trust that it provides constant accessibility, i.e. no service stoppages because of maintenance or power outages, no service outages whatsoever. Trust in your ISP that you have constant internet access, no interruptions whatsoever. Trust in every other ISP, wherever you may travel, that it also provides constant high-rate connections to your data provider.
That’s a lot of trust.
There’s a long, long way to go till companies can provide that level of trust.
http://klotys-welt.blogspot.com/2010/05/dont-believe-hype.html
I believe David’s point is that there are two kinds of usage for the term “cloud computing” and the one everyone is wary of (thin clients, hosted offerings, software as a service, etc.) is quite different from the one that’s already gaining lots of popularity (Using “private clouds” to make provisioning much simpler and minimize waste).
Cloud computing has its merits, but it can also be a trap if you’re not careful. Delivering your data to someone outside of your physical scope means that, sooner or later, you will have problems with your data (as always), but you will be unable to step up and deal with it yourself. You’ll be at the mercy of some other company, and that is ALWAYS a bad thing, because you’re just another customer; your data isn’t as valuable to them as it is to you.
Nonetheless, cloud computing is excellent for volatile things. For example, a marketing company can rent computing and bandwidth resources from a cloud computing vendor to deal with a summer campaign for two weeks, instead of spending money on new servers. It’s a good thing when you have a scalable IT architecture.
Also, cloud computing is just a fancy-just-invented-now-so-it-must-be-good name for grid computing. I yawn when I hear “cloud computing”.
The problem with cloud computing is the same as when IBM introduced the concept (though they didn’t call it cloud computing) in the 1950s, and the solution is the same: if it’s critical to your business, do it yourself regardless of cost. Otherwise, go with the most reliable, cheapest solution.
Cloud computing ‘should’ be invisible to the end user.
When it comes to hosting, very few businesses need more than a VPS.
Managed dedicated hosting is nothing new, you get a single instance and the hosting company handles the rest.
Cloud computing is more useful to companies that require a heavy amount of shared processing. But even in many of those cases it is not going to be cost effective, given how cheap gigabit networking and multi-core CPUs have become.
Cloud computing is supposed to bring in the age of the web app but people said the same thing about RoR, Java and AJAX.
Cloud based backup services are useful but that is nothing new either.
I saw an endorsement somewhere for cloud computing that stated small businesses now have access to the same level of data centers as Fortune 500 companies! And why the hell would the typical small business need access to that type of data center?
So overall I think cloud computing is mostly hype even though it is useful in some cases. If it gets popular dedicated hosting companies still have a lot of room to cut their prices.
I just clicked on an ad here for cloud hosting at The Planet.
The offer is $50 a month, which includes 1 vCPU. Well, how much power is one vCPU? If it is 1/total_instances, that means nothing to me. For $180 I get 2 vCPUs. When will I find out if I need 2 vCPUs? When the company arbitrarily tells me? Oh joy, just like shared hosting.
Usually they will give a rough CPU equivalent, e.g. 1 vCPU = 1 Core 2 Duo, or something. Of course, you probably don’t always get that much.
Cloud computing is inherently a lot like real clouds, the white puffy kind. The sad thing is there is always a group of marginally employed tech assholes that will buy into the latest fad in IT and repeat it over and over again, and eventually the community moves on to the next idiot stage of its existence… “The next big thing.”
…because “cloud computing” is simply distributed systems. Not a bad idea, nor particularly revolutionary. The main difference today is that the logical servers can be slightly more abstracted from the physical servers than before.
Besides that – my data will be stored locally, thank you very much
I for one love virtualization. I actually wrote a big thank-you blog post at work praising our sysadmin guys. After reading this article, I think a lot of it is due to virtualization. We are fully virtualized. Every test server… everything. I want something up… boom, it’s there in half an hour.
I’ve always wanted to just have my work desktop hosted as well, and I’d just VNC/remote desktop in. They give us powerful laptops to work with… but I just use mine as a thin client to my desktop.
That all said, what I still don’t see… and what I don’t want is… web apps. They are good for certain things. Yet, I sometimes have to fight to say… ‘not everything should be a webapp’. The biggest problem is that you have to then host the application on some webserver. That’s great if you as a company are willing to devote the time, money, and resources to support that. Yet, a lot of the time, you just don’t have that and a local program is much more usable.
You do end up with more complex problems with webapps… especially scaling it all out.
You have to be really careful with apps that really are not collaborative or web dependent, but that you think will be easy to deploy… so you make them webapps. Nine times out of ten you’ll end up making things more complicated.
Granted, most of our customers are still ‘techy’ so that might be a different perspective… they have no problems running an exe or something.
Even for smartphones, the local client tends to win. The web was supposed to let you do anything anywhere, but it’s really poor as a programming and integration environment.
Hi, you wrote “There are two technological innovations that, available together, make cloud computing possible:”
I totally agree, but want to add that without service orientation inside the apps in the cloud, you won’t have the right services, so the ROI won’t be as high as if you had those services. So you need a former hype “innovation” – SOA – which is mainstream today.
One of my blog posts, “Cloud-SOA=Zero”, is at
http://www.x-integrate.com if you want to read more about this topic.
The last entry mentions a blog which is only available in German, sorry. But the site also has a lot of related info on this Cloud/SOA topic available in English.
Best
Wolfgang
None of you really understand what a Cloud is, or what it’s all about. You all just dumbly focus on business use cases. Even with the business use cases, you don’t fully understand the big picture.
It’s like a bunch of undergrad physics majors trying to criticize Hawking.
Then enlighten us and explain what it is, from your point of view.
(In my opinion, it has become a buzzword describing just about every online service nowadays, but maybe it has kept some precise meaning for someone…)
Well then take this opportunity to enlighten us.
Is it Skynet then?
Business use cases are being provided because we are talking about whether or not it is useful to businesses.
Furthermore the business use cases described here fit the services offered by Amazon’s EC2. Perhaps Amazon doesn’t understand cloud computing either?
I’ve examined the issue of cloud infrastructure implementation and management from both sides – from the point of view of those deploying solutions as well as those consuming their services. My final determination is:
I really don’t know clouds, at all…..
Humor an old man!
Or maybe what you own. I like the cloud for some things, like my address book or my email, and generic apps with a nice AJAX interface. But I have several USB drives (16 – 32 GB) that sit on my keys and are usually rsync’d to a folder in ~/Key_Drive. It may be insecure and possibly inefficient, or it may be perfect, because I cannot do something trusted on ‘your’ machine, and I do not expect trust in ‘their’ data center. If I lose my key drive, it is a $30 commodity loss, and I get full ROI by the third or fourth usage. None of what I do publicly is expected to be private, unless I elect to encrypt it – which triggers many U.S. Government red flags (and only if I mail you my ‘key’, or put it on a key drive and hand deliver it to you, can I really expect privacy or security).
I guess for the user the Cloud is New and Great and (dare I say) Sexy but it will always feel like a thin client on a leaky network.