It’s official, boys and girls: it’s easier to kick in a door when it’s open. “A test has revealed that a Linux server is far less likely to be compromised. In fact, unpatched Red Hat and SuSE servers were not breached at all during a six-week trial, while the equivalent Windows systems were compromised within hours. However, patching does make a difference. Patched versions of Windows fared far better, remaining untouched throughout the test, as did the Red Hat and Suse deployments.”
Who the heck doesn’t patch their servers? This article is moot.
Well – actually, there is a lesson here. It shows that just putting up a Windows server and then running Windows Update is not enough, as the server will probably be infected before you are done.
I once set up a Win Server 2003 test server in a VMware environment. The server was compromised well before the updates were applied. It was a pre-SP1 setup though, so I wonder how a Server 2003 SP1 would fare.
But it still shows that you should harden your computer before ever putting it on a network.
Who the heck doesn’t patch their servers?
People who want them to remain stable, and who don’t want to patch software which is completely unrelated to what function the server performs and what software they actually use.
They also don’t even want to patch the software they do use right away, but want guidance on how they can minimise the risk while they plan to get a patch installed.
The article is by no means moot.
So show me a Windows patch that makes Windows unstable.
I know a lot of companies that haven’t yet moved to SP2 because they haven’t tested their programs enough against it. That’s not servers, though, but desktops.
And what does that have to do with patches that cause stability problems? It’s not Microsoft’s fault that some 3rd-party software was written poorly and now won’t work in SP2.
Tell all that to the people that upgraded to Windows XP Service Pack 2 only to find out that some applications don’t run anymore… Now imagine that the application no longer running is something you use to run your business.
Also, the web-based parts of some unnamed crappy financial software use ActiveX controls, and SP2 makes ActiveX difficult:
http://www.winsupersite.com/reviews/windowsxp_sp2.asp#problems
Can we say whoops, time to pull out the tapes?
Also, here in our environment on our intranet… we had Novell NetMail embedded in an iframe so it stayed within the portal. Microsoft updated Internet Explorer and broke iframe pass-through authentication. Within a day, we had 6,000 requests to our service center from angry flight attendants / pilots not able to access their email. Time costs money, my friend, and that patch from Microsoft cost our business money. Even if a simple registry hack fixed the problem, it wasted quite a bit of our time.
So again, how is it Microsoft’s fault that some people’s business software has been poorly written?
After the SP2 “OMG NO IT BROKEZED MY WARES” hype subsided, the actual percentage of software tested not to work in SP2 was something like 1% — and it was all because the software made stupid assumptions, used stupid hacks, or expected certain things to behave in a certain manner. It’s kind of like writing software that uses a temp directory, and rather than using the %TEMP% environment variable, you hard-code it as C:\Documents and Settings\Administrator\Local Settings\Temp.
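To make that concrete, here’s a minimal Python sketch of the difference (the scratch file name is made up for illustration):

import os
import tempfile

# Fragile: hard-codes a profile layout that only exists on some installations.
bad_temp = r"C:\Documents and Settings\Administrator\Local Settings\Temp"

# Robust: ask the environment / standard library where temp actually lives.
good_temp = os.environ.get("TEMP", tempfile.gettempdir())

scratch = os.path.join(good_temp, "myapp_scratch.dat")  # hypothetical file name
with open(scratch, "w") as f:
    f.write("temporary data")
os.remove(scratch)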
Hmmm, I don’t know what kind of “unstable” you’re referring to, but MS rolled out the Windows 2000 security rollup about 6 months back, and it broke a lot of functionality; one example was the network reporting of the NAI Security Suite for SMB. I personally had that problem happen. They had to withdraw it from deployment and ready a new version.
You can call that “unstable”, ’cause it de-stabilised the server functionality…
Granted … that is one update that I distinctly remember causing problems.
There really is nothing else that I can remember.
These are the kind of tests we can do without.
This test is important for SMB companies. They tend not to have an IT staff and will let their server sit for years without patching (for not knowing better). Although, hopefully the IT company/person that set up the server initially would set updates to automatic.
They won’t let their server sit for years without patching. After a fresh install stops working within a few hours a couple of times, they learn to patch.
In other news, studies have shown there is a high chance you’ll die if you jump out of an airplane without a parachute.
Now back to you, Captain Obvious!
There are more people/worms trying to hack Windows than Unix/Linux/BSD. So of course it’s going to get hacked first. Duh.
It has yet to be proven that how frequently an OS is compromised has to do with its popularity. IMO that mindset is entirely wishful thinking, since good design is as much a part of security as obscurity is.
Homogeneous system with flaw A + worm that exploits A = fun.
Heterogeneous uniformly distributed system composed of elements with distinct unshared flaws A, B, and C + worm that exploits A = less fun.
Heterogeneous system composed of elements with distinct unshared flaws A, B, and C where 90% of the elements have flaw A + worm that exploits A = I can’t believe it’s not homogeneous!
If you take it as a given that despite any efforts with respect to “good design” there still exists some flaw that can be exploited, then it’s clear that the effects of this flaw will be more significant the more uniform the environment in question is. If you’re looking to maximize distribution to wield the computing or network resources of the most systems, or simply cause the most havoc for entertainment or blackmail, it’s pretty obvious that the best target is going to be the dominant one. Even if you aren’t motivated by dominance, your efforts are going to see the most damage because the distribution is so lopsided by Windows’ share of the market.
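A back-of-the-envelope Python version of those three scenarios, treating a flaw’s market share as the fraction of hosts a worm exploiting it can reach (share figures are illustrative, not measured):

# Toy model: a worm exploiting flaw A reaches exactly the share of hosts
# that carry flaw A.
def expected_impact(shares, flaw="A"):
    return shares.get(flaw, 0.0)

homogeneous = {"A": 1.0}                         # everyone shares flaw A
uniform = {"A": 1 / 3, "B": 1 / 3, "C": 1 / 3}   # flaws evenly spread
lopsided = {"A": 0.9, "B": 0.05, "C": 0.05}      # 90% share flaw A

for label, shares in [("homogeneous", homogeneous),
                      ("uniform", uniform),
                      ("lopsided", lopsided)]:
    print(label, expected_impact(shares))
# homogeneous 1.0, uniform 0.33..., lopsided 0.9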
Let’s try that again…
Windows right now is a big target, but saying that’s the case because it’s the dominant OS on the market is terribly flawed. Windows’ pseudo-permission system is an easily circumvented hack; the whole system can be infected from any user account with ease. Windows has flaws the likes of which have never been seen before on other platforms, such as making it possible to embed code in picture and video formats (remember that? http://www.google.ca/search?hl=en&q=wmf+exploit+site%3Aosnews.c…), and neat dangers such as VBScript and ActiveX which provide too much power without a proper sandbox. “Do you want to allow scripts from this site to run? [yes if you want it to display properly/no if you want the site to be broken]” just isn’t adequate security. Windows security updates aren’t always on time, and some things never get fixed.
You cannot say that isn’t a large factor in the security of an operating system.
Windows’ pseudo-permission system is an easily circumvented hack; the whole system can be infected from any user account with ease.
Hmmm. Not that I don’t believe you but I’ve never heard this before. Any articles you wish to share about this?
Your comment regarding Windows’ filesystem permissions is plain wrong.
The ability to embed exploits within data files is hardly novel. See zlib for just one example.
The attention to security and flexibility designed into systems will provide tradeoffs in possible malicious software. A lot of aspects of Windows weren’t designed with resilience toward trojans in mind. It is, however, trivial to see that the reason people spend time picking the low-hanging fruit of Windows is because it’s useful for their purposes to do so, and that this is exacerbated by its popularity. Even if the number of flaws were reduced 99%, finding those flaws would provide the most reward and their impact would be the greatest.
“Your comment regarding Windows’ filesystem permissions is plain wrong.”
No it wasn’t, although I may have accidentally misled you with my wording. The problem with user accounts in Windows right now, and the reason they’re a dirty hack, is that users who don’t have administrator privileges often end up with files, folders and shortcuts on their desktops which they have no ability to rename, move or delete. Because of this the users usually get rather angry, and in the end either you give them administrator privileges so they are once again in control of their desktop, or they simply take administrator privileges without asking.
Have you ever booted XP Home in safe mode? The Administrator account is just sitting there with no password. Supposedly you can set it during the installation, but that’s not indicated at all in the Windows XP Home installer, and most people using Windows XP Home don’t know about it at all. This is one of a few ways users can increase their account privileges.
“The ability to embed exploits within data files is hardly novel. See zlib for just one example. “
Zip files don’t just run by being embedded into a web site; images that took advantage of the WMF vulnerability did. Microsoft even unknowingly facilitated this by intentionally making the mechanism available themselves. Microsoft showed that, in a best-case scenario, some of their legacy code which was never reviewed, and is still in current versions of Windows, was never designed with security in mind.
“A lot of aspects of Windows weren’t designed with resilience toward trojans in mind.”
Windows doesn’t need trojans to get infected; the WMF vulnerability was exploited without the user having to install anything themselves. And what’s more, you argue that Windows, which wasn’t designed with that in mind, fares just as well with its current user base as a system designed with security in mind would at the same user base. Design is important, especially when it comes to the automated spreading of malware such as how viruses get around. Mac OS X did have flaws recently, but they don’t qualify as viruses; the only viruses around for OS X and Linux are proof-of-concept viruses which wouldn’t get very far in the wild because of how fast patches are put out and, in the case of Linux, how well user privileges are implemented. Plus for those who are absolutely paranoid there is AV software for Linux, and I’d be surprised if there wasn’t any for OS X. That said, viruses are a good example of poor security-minded design but a bad benchmark for vulnerability, because AV companies hype people up to the point of wetting their pants over the latest viruses in order to increase sales.
“Even if the number of flaws were reduced 99%, finding those flaws would provide the most reward and their impact would be the greatest.”
So how come it’s taking so long for someone to come up with a way to get Windows XP working on MacBook Pros? Or, for that matter, why can’t I simply download an ISO of a recent version of OS X that I can install on my generic x86 box just as transparently as if I were running one of Apple’s very own boxes? Simply put, crackers are imperfect human beings like the rest of us, and if there were fewer flaws it would take them longer to find and exploit them. My comments on Microsoft leaving some things unpatched, and other things too long without a patch, relate very much to this.
“Even if the number of flaws were reduced 99%, finding those flaws would provide the most reward and their impact would be the greatest.”
The only rewards I can think of for cracking are:
1) If you want to set up a zombie network.
2) If you’re a bored script kiddie.
3) Stealing private data.
4) If you’re an experienced cracker testing your skill.
Number 1 is a big one, and in that case going after the dominant OS would be advantageous; however, Windows is most frequently infected either by viruses or trojans, which are far easier to implement for Windows than for any other modern OS I can think of. Linux and maybe Mac OS X (which I’ve never actually used, that’s why I said maybe) require the root password to install anything beyond the scope of your user account. If you’re root you usually know what software is good and what software isn’t. Someone really wouldn’t put a lot of work into breaking into each computer individually, because that would mean a very slow-growing zombie network.
Number 2 relies on Number 4 to do much of anything; they are large in number but small in skill and know-how. Often just bored kids, they have to take advantage of tools created by Number 4 to do 90% of what they think they know how to do all on their own. The only reward for them is the illusion that they are l33t. They have no control over which platform will be the easiest for them to break into; it’s a matter of which platform has the longest delay for patches.
Number 3 targets all systems already and the best target isn’t your home computer but rather company servers. These systems have a much more distributed user base than workstations and desktop computers do. There’s not much of a monoculture there, and so there’s much less reason to specifically target Windows.
People from group Number 4 usually discover security holes and privately notify the developers, giving them anywhere between a week and a month to fix the security hole(s) before the public is told and software, APIs and example code come out showing group Number 2 how to exploit those security holes. In the case of Linux, security holes are often publicised very quickly without keeping anything secret, and patches usually follow promptly and/or a workaround is immediately disclosed. Microsoft has the worst track record here that I know of, although admittedly there may be no operating system with a perfect record here.
That said, I don’t want you to feel that I’m saying Windows is swiss cheese. Given a firewall, adequate defensive software, and a knowledgeable administrator, Windows can be “safe enough” IMO.
No it wasn’t, although I may have accidentally misled you with my wording. The problem with user accounts in Windows right now, and the reason they’re a dirty hack, is that users who don’t have administrator privileges often end up with files, folders and shortcuts on their desktops which they have no ability to rename, move or delete.
User accounts in Windows aren’t a dirty hack. The craptastic Win32 software designed to target Windows 95 is dirty. Some of the decisions regarding XP Home and file permissions were poor. Ever encouraging the use of FAT32 on NT systems was also stupid. However, Windows, especially as it pertains to the topic at hand, does not have file permissions that are trivially circumvented automatically.
Zip files don’t just run by being embedded into a web site
Unfortunately a zlib problem becomes a problem of libpng, apache, and so forth. A libjpeg problem becomes a problem of everything that links to libjpeg. I’m telling you that bugs in libraries that process data files are not novel.
And what’s more, you argue that Windows, which wasn’t designed with that in mind, fares just as well with its current user base as a system designed with security in mind would at the same user base.
You can have any discussion with yourself that you want, but don’t attribute it to me.
Mac OS X did have flaws recently, but they don’t qualify as viruses; the only viruses around for OS X and Linux are proof-of-concept viruses which wouldn’t get very far in the wild because of how fast patches are put out
When it comes to a production environment, a “virus” takes a pretty massive back seat to any means of privilege escalation. That said, viruses wouldn’t propagate rapidly between MacOS X and Linux installations because there are significantly fewer hosts, with usage models sufficiently different from those that spread Windows viruses, that it would be easier to break into Apple or $LINUXVENDOR’s network and distribute them that way than to expect them to be distributed in the wild. If you’re going to do that, then you might as well prefer subtle backdoors. There isn’t a lot of Linux warez to distribute via Kazaa. Worms would have a much better chance of spreading on these hosts.
Plus for those who are absolutely paranoid there is AV software for Linux, and I’d be surprised if there wasn’t any for OS X.
For the most part this software is used to scan files for viruses that are predominantly targeting Windows. Maybe the occasional malicious Office macro that would be a pest for others using Office.
So how come it’s taking so long for someone to come up with a way to get Windows XP working on MacBook Pros
Most of the people trying to run XP on x86 Macs, if you look at their comments in forums, aren’t precisely the sharpest tools in the shed. This isn’t a “security” measure on the part of Apple or Microsoft; it’s a matter of making an operating system that shipped requiring the PC BIOS operate without it, with half of the people trying just copying files from Vista betas or screwing around with partitions and installation sources. On top of this, it’s a really small market that has gone out and bought an x86 Mac to run Windows on it.
Or, for that matter, why can’t I simply download an ISO of a recent version of OS X that I can install on my generic x86 box just as transparently as if I were running one of Apple’s very own boxes?
You can run OS X on a decent quantity of hardware if you want. I don’t know what you think the limitations of available drivers have to do with the inherent effects of homogeneous computing on the impact of security flaws. You can’t just pull a device driver out of thin air, but that isn’t security.
Simply put, crackers are imperfect human beings like the rest of us, and if there were fewer flaws it would take them longer to find and exploit them.
Though your examples have nothing to do with this, it is plainly obvious that the more esoteric a system’s flaws are, the more difficult they are to uncover. And once a flaw is uncovered, the proverbial shit hits the fan and people scramble to update 90% of the computers in the world (far fewer in practice, because a bulk of them are in the hands of people who wouldn’t know up from down in these matters), because they’re all vulnerable. These flaws, though, are endemic to the software development practices used on all platforms. On top of any and all mental failings on the part of Redmond, the bulk of the damage can be done by doing it to Windows, and that is where the greatest malware interest lies.
Windows is most frequently infected either by viruses or trojans, which are far easier to implement for Windows than for any other modern OS I can think of.
Short of an operating system that only accepts programs written in brainf–k, or a system that relies on signed binaries, trojans are equally easy to write for all major platforms. Viruses aren’t any easier to add to PE files than to any other format; they just propagate more thoroughly because
1. Most Windows users are afraid of a multi-user operating system, or are impeded by the broken software they use from using it properly
2. Most illicit file trading (the easiest way of distributing a virus next to e-mailing them to people that don’t know any better) that doesn’t trade RIAA/MPAA content consists of programs that run on Windows. The 50 million unlicensed users of Photoshop can’t be wrong.
Writing a virus in a multi-user environment with a 3-5% share of users, with the same impact as one practically running in a single-user environment, would require the scanning and exploitation of local security flaws, which would involve more complexity. If you can obtain privilege escalation on a server, you might as well steal anything of value on the system rather than copy your payload to ls (where it’s bound to see significant redistribution!). That doesn’t mean that the viruses written for Windows, or DOS before it, are trivial, because some of them are quite clever.
Number 3 targets all systems already and the best target isn’t your home computer but rather company servers.
No, these are both excellent targets. The home user is constantly buying things from eBay, Amazon, and so forth. They have the worst record-keeping and the least competent administrators. And probably the best amateur porn around.
That said, I don’t want you to feel that I’m saying Windows is swiss cheese.
It really has nothing to do with Windows being insecure or not. It’s merely a matter of recognizing that interest and impact follow usage, and a homogeneous computing environment relying on the current methods of providing system integrity is always going to be the prime target. A sufficiently-constrained system with different design priorities, say one that required signing and proof-carrying for example, would be a more difficult target than even the most up-to-date Windows XP Home or MacOS X installation. But despite the challenges that would impose, if that were the single system 90% of the market turned to it would be just as appealing and the ramifications of flaws would be just as severe.
I’m going to give this one last try, if you continue arguing I’m just going to ignore you after that because I’ve got other stuff to do.
Your argument that Windows is less secure because it’s more popular is a common unproven theory which I consider flawed to the point of being a blatant lie.
Yes, Windows has a huge user base, but everyone taking advantage of that is competing over that user base. People don’t want botnets on machines infected with viruses and all sorts of other malware, because those would be crippled if there were too much garbage on them and they’d be worthless. The same goes for virus writers; they don’t care for computers that are already full of viruses or malware, because those would be more or less crippled and wouldn’t have much bandwidth to spare for propagating those viruses. I doubt hackers get much in the way of bragging rights or an adrenaline rush out of cracking yet another Windows box.
I don’t know exact figures, but I would estimate that Linux and Mac OS X each have a minimum user base of one million; that is hardly obscure, and it’s not hard to find on the internet. That is a target which, according to you, is untouched because its “obscurity” makes it secure. By your line of thought that they would be just as insecure as Windows given the same user base, it must then be assumed that crackers would be able to exploit those systems just as easily as Windows. So why would they compete over a crowded market instead of going after a clean one? Why fight amongst each other over the Windows machines instead of going after what is an unspoilt mass of Linux, Solaris and OS X machines? The answer is simple: it’s harder, and it’s harder because those operating systems are more secure by nature, not by obscurity. As for hackers, don’t you think they’d get bigger bragging rights for breaking into a system that people consider more secure, like Linux, Solaris or OS X, than they would by breaking into a Windows system like every other hacker and script kiddie out there? I imagine they’d get a far greater kick trying to break through SELinux than they would going after Windows.
See that burning thing crashing to the ground? That’s your argument. Now please stop being irrational. I’ve laid out my case flatly right here, and no, it’s not a direct reply to your above post; it’s a reply to your main points throughout the entire argument. I won’t be led on a wild goose chase until you’re either too frustrated to continue, or you somehow think you’ve proven yourself right.
Oops, this was supposed to be attached to “Get A Life”‘s reply to my post.
Consider for example DOS, which has no security model. What is the interest in providing DOS malware now, compared to 1994 when significantly more people were actively using it? What percentage of people creating malware look for flaws in network-capable DOS programs in 2006?
And my pet rock is more secure than DOS. :-p
First off, I never said that user base had nothing to do with security, simply that it was an over-optimistic excuse overused in the defense of Windows. Secondly, you’re cheating; the playing field isn’t even. DOS is neither internet-capable out of the box, nor is it in use outside of niches where it’s rarely if ever subjected to any attempts to circumvent its security. Windows, Linux, Mac OS X and Solaris are all exposed to the internet; they all have big enough user bases that you can find people using them on the internet, and they perform the tasks needed of them. DOS just doesn’t compare at all.
DOS is completely out of the picture. When people or companies are getting new machines they aren’t going to waste a fraction of a second thinking about DOS when they have to choose an OS; DOS cannot perform 99% of the tasks required of modern-day servers, workstations or desktops. DOS came and went. If you want to compare DOS to something, why not compare it to an Apple IIe or a Commodore 64 (not really all from the same time, but you get the idea)?
You can go ahead and argue beyond logic, but you can’t win in those cases as my pet rock example demonstrates.
DOS has been network-capable for a long time, and anyone that’s using DOS today for desktop computing (?! as a server ?!) has network access. The reason that DOS doesn’t compare is because it isn’t used outside of the people running Windows 9x, except in embedded areas and for providing boot disks for flashing hardware. If I thought you knew much about security I would posit any number of single-user operating systems with no meaningful security models that are equally irrelevant to platform-specific malware, because the userbase is essentially irrelevant. They lose out on the software-specific flaws because there isn’t any software. They lose out on the service-based flaws because even if they run the vulnerable services, no one’s going to write the necessary exploit. Ah, but in its day DOS was the single-largest target there was for every piece of malware. And then Win 9x (again, no meaningful security model, but it only obtains carry-over malware that works on Win32 in general, provided the malware doesn’t depend on something not available on Win 9x). And now no one cares, because it’s not the king of the pile anymore.
Ok, so you’re still arguing that obscurity is THE security. In that case I’m going to write a program full of security holes and lock it away in my closet; then I’m going to tell companies about it, and they’re all going to want to buy it and use it, because locked up in my closet it’s the most secure software in the world. Your argument can only go so far, and I already said that I agree that obscurity is part of security, but how the code is written is also a large part, and Linux and Mac OS X users number enough that they aren’t obscure. Please stop now; you’re defending Windows with an illogical, endless and pointless argument. Your reasoning is fine in your own world, but to an actual market it’s just bullshit.
“DOS has been network-capable for a long time”
Lying by omission. Network-capable doesn’t mean it can connect to a network or the internet out of the box; every flavour of DOS I used way back when needed special software or drivers to be installed manually, depending on whose DOS it was. Furthermore, DOS is not used directly connected to the internet any more; it’s used in businesses with their connections going through routers. Those machines are very sheltered from the internet and don’t connect directly to it.
“The reason that DOS doesn’t compare is because it isn’t used outside of the people running Windows 9x,”
You have got to be kidding me. Your grounds for a DOS-to-modern-OS comparison being fair is that all Windows 9x users are still DOS users! DOS is part of their boot sequence and Windows sits on top, but they don’t count as DOS users; they are Windows users because they are using Windows. DOS on its own is isolated to a very minuscule niche, and you cannot compare it to Windows based on security. If you want to have your double standard, then to make things fair every Windows 9x flaw should also be considered a DOS flaw, thus adding Windows 9x’s poor security record onto DOS’s record.
“except in embedded areas and for providing boot disks for flashing hardware”
Oh yeah, and we all know people run those on the internet all the time; they just whip out their DOS telnet clients and pretend it’s the good old days again.
“If I thought you knew much about security”
You think I don’t know much about security? Look at your arguments! I’ve resisted being blunt with you as much as possible and I expect the same. Calling me names because you’re losing is something I’d expect from a 13-year-old.
“They lose out on the software-specific flaws because there isn’t any software”
I know that, but companies cannot use computers without software. You can carve an RJ45 port and a Windows logo on a rock and say it’s the most secure computer in the world, but that won’t do any good because it’s of no use to anyone. Current versions of Windows, Linux and Mac OS X are all useful today, and they are modern operating systems capable of running modern software. That is why they can be compared. DOS is an antique and cannot be.
“Ah, but in its day DOS was the single-largest target there was for every piece of malware.”
In its day, that’s past tense. If people started picking up DOS again in large enough numbers, it would be swiss cheese so fast you wouldn’t know what bricked your computer. Obscurity is false security; it’s only temporary. Well-designed code is the real deal, because by its very nature it’s less vulnerable.
“because it’s not the king of the pile anymore.”
And where’s it going to be in five or ten years? Nowhere. Its numbers are already dwindling down to commercial machines companies are too cheap to replace. When software is too old to be of any use any more, people drop it and it’s like it never existed. Only history books and fond old memories keep it alive.
Now tell me, how does any of this logically relate to comparing the security of current versions of Windows to current versions of Linux or Mac OS X? That was the original argument, and you’ve trailed off on this ridiculous charade in order to make excuses for Windows that you think I’m dumb enough to fall for.
emmm, this was a test on SERVERS. You are aware that there are far more unix/linux/bsd servers in the world than there are Windows ones?
Using your logic, this is actually far worse for Windows: a smaller target getting hit harder…
now what was it you said ?
oh yes..
duh
Some people around here don’t even bother to read the article before they come on here and bitch:
“my windows is invulnerable to anything”
“my linux will never be taken down ”
“you windows users are dicks”
“linux sucks, it won’t run anything”
etc etc etc
but, having said that… my Windows is FIRMLY locked down
Was this an active or passive test? Meaning, did they just expose the boxes to the internet and wait for anyone to break in, or did they actually have people actively trying to get in?
If the former, then it’s slightly unfair. Last time I checked, there aren’t many automated trojans/worms/etc that are actively scanning networks for holes in Linux boxes, especially vendor-specific stuff.
I’m not trying to argue that unpatched Linux is insecure, just that the test is a bit unfair if it was passive, and doesn’t say much of anything.
“If the former, then it’s slightly unfair. Last time I checked, there aren’t many automated trojans/worms/etc that are actively scanning networks for holes in Linux boxes, especially vendor-specific stuff.”
This is a good point, but don’t discount the manual scanners. There are people out there that scan entire networks to find *nix machines with security holes so they can install a rootkit. True, it’s not automated, but that actually makes it more serious, as there is thinking behind it.
No it is not. By the numbers, there are more Linux/Unix webservers on the internet. Also, there are probably more Linux servers compromised than Windows servers simply because there are so many more of them.
However, when you take percentages into account, a higher proportion of Windows servers are compromised. The advent of programming languages that don’t encourage secure programming *cough* php *cough* has opened a whole new slew of problems.
Don’t kid yourself… there are tons of scanners for ’nix machines. I’ve seen hacked Windows PCs using nikto to scan entire subnets for vulnerable php scripts on Linux machines.
Edit:
http://news.netcraft.com/archives/web_server_survey.html
Indeed. And Linux users, by default, think their machines are like Fort Knox.
I myself am a Linux user, and I know that my system is inherently far more secure than a Windows system, but that does not stop me from locking down ipchains, only getting binaries from trusted repositories, and checking MD5 files…
There are bad guys out there!
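For what it’s worth, the MD5-checking habit is easy to script. A minimal Python sketch (the package name and expected digest below are made up for illustration):

import hashlib

def md5sum(path, chunk_size=65536):
    # Hash the file in chunks so large downloads don't need to fit in memory.
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

expected = "9e107d9d372bb6826bd81d3542a419d6"  # digest published with the download
if md5sum("some-package.tar.gz") != expected:
    raise SystemExit("Checksum mismatch: do not install this file!")
print("Checksum OK")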
My logs say there are plenty of automated trojans/worms/etc scanning continuously for holes of any type in any OS.
I can take a 5-year-old Linux CD, install it on a machine, disable TCP and UDP services very easily, and then connect it to the Internet, either for work or to download patches.
Doing that with an original NT4 or 2000 CD will usually result in a zombie PC before I finish downloading the service pack.
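As a sanity check after that “disable the services” step, something like this rough Python sketch confirms nothing is still listening before the cable goes in (the port list is illustrative, not exhaustive):

import socket

# Common TCP ports a stock install of that era might have had open.
PORTS = [21, 23, 25, 80, 110, 111, 139, 143, 443, 445, 515, 3306]

def open_ports(host="127.0.0.1", ports=PORTS, timeout=0.5):
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                found.append(port)
    return found

print("Still listening:", open_ports() or "nothing")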
Also, one of the secrets of running servers is: if it’s doing what it should do, don’t fiddle; you’ll only break it.
BTW, “Windows XP Professional, unpatched, lasted one hour and 12 seconds.” Was that with or without SP2?
BTW, “Windows XP Professional, unpatched, lasted one hour and 12 seconds.” Was that with or without SP2?
That question is on the same level as the article.
“That question is on the same level as the article”
The difference being that XP SP2 can come on its own installable CD, whereas (as far as I know) you can’t get XP SP2 plus the whole load of other patches on CD, at least not officially.
If the CD that comes with the server simply can’t be connected to the Internet to download the patches without being infected with the nasties, then shouldn’t MS be sending out patch CDs to all registered users, everywhere, in the whole world, for free? After all, they’ve made a lot more money than Ubuntu has.
No, can’t see that one happening, can you?
Time to patch is greater than time to hack. It would seem just a better choice of default settings after an install would give it more of a fighting chance, and make this more a case of an installer assuming a nice safe friendly environment rather than being brutally raped and ravaged on the internet. It doesn’t say how they were exposed either, but I’m willing to bet it was not behind a firewall. The Linux box with iptables *is* a firewall. That really was not a fair comparison, but more “MS orange to Linux banana that has sort of the orange’s subset of functionality but is so much more”.
For several software packages that we run on our servers over here, the author company only warranties the software if it’s running on a server at a certain patch level. We were two service packs behind continuously until just this year. It was that, or run without any kind of support. We chose to run without the service packs and firewall the hell out of the servers.
“the author company only warranties the software if it’s running on a server at a certain patch level”
I know, and it’s a really bad issue!
I mean, for many reasons it is desirable to have software using shared libraries: fewer objects to load, fewer objects to patch. It’s efficient (you don’t need to store and load different copies of something) and secure (when you patch a component, anything using it is secured).
But to guarantee a very strictly defined environment, a self-contained application is better and more straightforward than any other solution: you know what it contains, and you know that nothing will update it until you decide to update the application. No unexpected points of failure.
However, most applications are not self-contained, so if strict compliance with a certain patch level is critical, the machine should be placed in a very well-protected zone (and not be updated).
For that reason, virtualization (or emulation) software like VMware, QEMU etc. may definitely be a worthy choice for keeping the hardware consolidated, embedding several servers into a single, up-to-date and secured machine.
At this point, in many cases, the virtual servers may be configured to respect very specific constraints that may not even be security-related.
However, the latest report also includes a count of bugs found by security researchers that have not been confirmed by Microsoft or the Mozilla Foundation, which owns Mozilla.
This is good. Either company may legitimately not be able to confirm a bug immediately, even a very severe bug. And it’s only natural to hesitate to confirm your product has a bug if you’re not sure. By including bugs found by third parties, they reduce conflicts of interest.
Some of the comments posted here imply that people actually connect a PC/server to the raw internet before hardening or patching. If this is true, you get what you deserve. That is a bad practice, and anybody up on security at all is aware that you have to do all the hardening and patching before connecting to the raw net.
It’s only a bad practice because of the sorry state of the software in question. We all long for a product that is so well written we don’t have to give any mind to security. (And this article suggests Linux comes closer to that ideal.)
What we all want is a product that does everything for us without our having to do anything at all. But back in reality, if you take a known unpatched system susceptible to remote exploits and connect it directly to an unprotected network, you’re an idiot. If you didn’t know, then you’re just ignorant; but if you’re being paid to administer the system in question, you’re not qualified for the job. When you know that Windows as shipped N years ago ships with K remotely exploitable flaws out of the box and connect it directly to an unprotected network, you’re the fish in the barrel.
The state of Windows is almost surely worse for the home installation, where the party in question is much less likely to be aware of the risks, and much less adequately prepared to mitigate them. It is perhaps only by the accident of how some people share home Internet connectivity among multiple computers (typically behind a NAT router) that they aren’t subject to even more trouble than they already are.
But back in reality if you take a known unpatched system susceptible to remote exploits…
That reality is exactly the sorry state to which I referred.
That’s the point: they are kicking naked machines right out of the box onto the internet and letting them fend for themselves.
Studies have shown that if you leave your BMW unlocked in the middle of downtown Compton with $1,000 in the front passenger seat in plain sight, somebody is probably going to break into your car.
One could hide poisonous gas inside the money pack
or… alter your local laws to allow for things South Africans are allowed to do to car thieves hehehehe
http://news.bbc.co.uk/2/hi/africa/232777.stm
Last time I checked, there aren’t many automated trojans/worms/etc that are actively scanning networks for holes in Linux boxes, especially vendor-specific stuff.
Turn up your logging. We get dozens of attempts every day against a variety of *ix-based security holes.
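If you want to tally those attempts yourself, here’s a rough Python sketch. It assumes a syslog-style auth log at /var/log/auth.log and the usual sshd “Failed password” message format; both vary by distro:

import re
from collections import Counter

LOG = "/var/log/auth.log"  # path varies by distro
pattern = re.compile(r"Failed password for .* from (\S+)")

attempts = Counter()
with open(LOG) as f:
    for line in f:
        match = pattern.search(line)
        if match:
            attempts[match.group(1)] += 1  # count per source address

for ip, count in attempts.most_common(10):
    print(f"{ip}: {count} failed attempts")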
“dozens” isn’t that many to be honest.
Not only that, since *nix is so fragmented, it’s much, much more hit-and-miss when scanning for security holes. Granted, someone doing it manually would probably have a much better chance, but automated scanning is much harder.
“If the former, then it’s slightly unfair. Last time I checked, there aren’t many automated trojans/worms/etc that are actively scanning networks for holes in Linux boxes, especially vendor-specific stuff.”
As a Linux/BSD/Solaris administrator at one of the largest web hosting companies in the US, I can assure you, you have no clue what you are talking about.
I did virtually the same test some time in 2005 in my office. I was testing the connectivity of my ISP after a network failure on their side. I put a freshly installed Windows 2000 (no SP) on the net, configured the IP address stuff, opened Internet Explorer to the default first page, and browsed the web for a while.
Then…
The machine got infected with a virus before I unplugged the network. Within 5 to 10 minutes, I think.
I spotted it because I opened regedit after I unplugged the network, and found a malicious entry in HKLM…Run.
“Very nice”, I told the technician of the ISP, “The network is working correctly. Both outbound and inbound connection work now.”
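Incidentally, that regedit check is easy to script. A minimal Python sketch (Windows-only) that walks the standard per-machine Run key, the same spot where the malicious entry showed up:

import winreg

# Enumerate auto-start entries under the per-machine Run key.
RUN_KEY = r"SOFTWARE\Microsoft\Windows\CurrentVersion\Run"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, RUN_KEY) as key:
    index = 0
    while True:
        try:
            name, value, _type = winreg.EnumValue(key, index)
        except OSError:  # raised once there are no more values
            break
        print(f"{name} = {value}")
        index += 1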
I wrote an internal memo to the president and board of directors at the company whose servers I maintain. The VP of operations was complaining about the fact that I was taking our servers down monthly, and occasionally more often, for patching, questioning the need for myself, my team and our efforts, since we hadn’t had a major outage in two years.
I believe my exact words in that memo were:
The attitude “if it works, don’t fix it” does not fly in the modern IT environment, because there are big nasties out there that will bend you over the table and shove your server where the sun doesn’t shine if you aren’t packing heat. The best protection is fully patched and up-to-date software.
I actually wish I’d had an article like that to append to my memo. YES, it’s a big giant no-brainer, but one you basically have to shove down the suits’ throats on a regular basis.
The only thing I take issue with is them saying the unpatched *nix equivalents were ‘safe’… Security patching is vital, important and necessary REGARDLESS of what OS you are running, and getting lax due to articles like this saying “the unpatched were fine” is just asking to be… as I put it months ago, “bent over the table.”