“In response to Jeff Jones’ Monthly Security Scorecard I did some research on Secunia and made some statistics to answer his. Jeff’s Scorecard is quite minimal in my opinion and, as pointed out by some of the comments, is missing some interesting facts. These facts include the outstanding advisories, for example, and of course the amount of software installed. Since Linux installs a lot more software the numbers are a bit skewed; however, even if I only take the numbers from Secunia with regard to advisories, vulnerabilities fixed, etc., things still look quite different than on Jeff’s charts.”
// These facts include the outstanding advisories, for example, and of course the amount of software installed. Since Linux installs a lot more software the numbers are a bit skewed; //
The same could be said about the workstation and end-user install base of Windows versus that of Linux distributions, yet Windows supporters are constantly told (verbally beaten, even) that it is irrelevant.
Sometimes people just over-analyse things.
Edited 2007-03-19 01:31
{ The same could be said about the workstation and end-user install base of Windows versus that of Linux distributions }
No. You have it wrong. The opportunity for having a vulnerability varies as to the amount of different code considered, not to the number of installs of the same code.
If I have one 1-million-line program installed on 10 systems, and a different 1-million-line program installed on 10 million systems, I still have the same opportunity for a security hole in each case – somewhere within the 1-million lines there could be a bug that makes the software vulnerable.
The number of systems that can be compromised because of any vulnerability is vastly different, however.
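To put rough numbers on that distinction, here is a purely illustrative sketch (the defect rate per million lines is invented, not measured):

```python
# Toy model: vulnerability count scales with the amount of distinct code,
# while the blast radius of any one vulnerability scales with installs.
# The defect rate below is invented purely for illustration.
BUGS_PER_MILLION_LINES = 5

def expected_vulns(lines_of_code):
    """Expected vulnerabilities: depends only on how much code there is."""
    return BUGS_PER_MILLION_LINES * lines_of_code / 1_000_000

def systems_at_risk(installs):
    """Systems exposed by any one vulnerability: depends only on installs."""
    return installs

# The same 1-million-line program, two very different install bases:
print(expected_vulns(1_000_000))      # 5.0 either way
print(systems_at_risk(10))            # 10
print(systems_at_risk(10_000_000))    # 10000000
```

Same opportunity for a hole in both cases; vastly different consequences when one is found.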
If that were to be true, it’d be impossible to compare systems and their software.
Like I said, over analysing.
{ If that were to be true, it’d be impossible to compare systems and their software. }
Que? Say what?
How is it impossible to compare “three critical unpatched vulnerabilities in 10 million lines” versus “6 identified but now patched moderate vulnerabilities in 60 million lines”?
I think you have re-discovered the art of the non-sequitur.
http://en.wikipedia.org/wiki/Non_sequitur_%28logic%29
(not that it was ever a lost art, mind you).
I think you are seriously confused between “number of vulnerabilities in a system” and “number of vulnerable systems”.
Different things entirely. Only the first quantity is any measure of “the security of the system”.
Edited 2007-03-19 10:10
Yeah, but the quantity of bugs multiplied by their criticality and the number of installed systems gives you the potential impact. Windows(TM) is leading there. If IBM had got some clever students to code its own version of UNIX, very probably the world would never have been hit by viruses and worms spreading like ‘shrooms in the first place. Well, whatever, it will be a thing of the past for most users soon, thanks to Vista(‘s price/quality market dynamics) ;=====P
“Yeah, but the quantity of bugs multiplied by their criticality and the number of installed systems gives you the potential impact. Windows(TM) is leading there. If IBM had got some clever students to code its own version of UNIX, very probably the world would never have been hit by viruses and worms spreading like ‘shrooms in the first place.”
That’s a rather tall leap to make; sure, the number of installed machines might increase the severity of a security vulnerability, as there is a greater surface area that will be affected, but it doesn’t make the software any more vulnerable.
The people who find the bugs in Windows aren’t hackers but companies paid by Microsoft, who analyse the code for Microsoft and give their feedback on possible security threats – then it is up to Microsoft to implement a fix.
The way people look at Windows, it is as though there are thousands of rogue script kiddies out there ‘finding’ bugs, when in reality the bug is already found; the problem is the delay between the bug being known and it being fixed by Microsoft, and, once fixed, getting end users to update their machines in a timely manner; it is a three-step process, and it takes time. Believe me, go out today and you’ll find people who still haven’t installed SP2 or any of the required security fixes.
Let’s not forget one thing also: Microsoft do have good programmers; the problem is their programme of perpetual backwards compatibility at the expense of security – one manager even went so far as to say, “legacy code is an asset” – it might be a ‘brings in the dollars’ asset, but in terms of security and maintenance it’s an absolute pain in the ass – and Microsoft *need* to break that compatibility: first of all heave out the legacy code to shrink the overall code base – reduce the target area – then audit, taking out bits which could cause security problems and replacing them with clean, secure code – with no backwards compatibility.
If it means that a good number of applications don’t work, then so be it – provide Virtual PC along with a free copy of Windows XP/Vista with the new ‘more secure’ operating system, in the form of a ‘ready to run’ image.
Microsoft can do it; it’s just that they don’t want to do it.
No, I’m not confused in any way. I understand what you’re saying. To suggest, however, that it is “understandable” (not quoting you, btw) that there are more security vulnerabilities in Linux because it ships with more software is to ignore the facts.
I really don’t understand this. Windows supporters get beaten and flamed for an apparent lack of security and greater prevalence of vulnerabilities in the Windows OS, primarily from the Linux supporter crowd, yet here we have someone, you, suggesting that it’s somehow understandable because of increased software shipped with the product.
It cannot be had both ways.
I don’t want to assume you’re a die hard Linux supporter because you don’t seem to suggest that to me. Frankly from experience neither system is secure out of the box, regardless of what software is provided during install.
The fact of the matter is that both systems are insecure out of the box. If that upsets some people, tough – it’s the awful truth of the matter.
As for out of the box security… only a fool would trust Linux or Windows out of the box without patching and hardening.
Edited to remove post from incorrect parent comment. Please ignore / delete post.
Edited 2007-03-19 20:49
Oh, my poor OSS zealot a$$hole. His ass got so burned by seeing Linux score poorly in the last report that he created his own alternate reality to make himself feel happy, giving useless arguments like:
Linux has so much more software crap….:)
Edited 2007-03-19 01:48
Do we have an ultra-secure Operating System? No way
That’s all you need to read of that second article. (A lot of bias in it, unfortunately.)
Do we have an ultra-secure Operating System? No way
That’s not the conclusion, that’s the disclaimer. The conclusion is that the Linux distros issue significantly more advisories than do other OS vendors, but they are also the only vendors that maintain a 100% patch rate.
As for bias, I’m not sure there’s a way to do an unbiased security comparison between OS vendors. They aren’t offering the same functionality, development model, or service model, so apples-to-apples comparisons are virtually impossible.
Security is a cat-and-mouse game when you think of it in terms of sysadmin vs. hacker. But when you introduce the concepts of motive and economics, you realize that hackers do what they do for two primary reasons: the glory of owning a high-profile target or technology, and the monetary value of stealing information or controlling access to it.
In the context of the former, it’s often not the most challenging target or technology they go after, but the weakest. Many hackers go after Windows and other Microsoft technologies simply because they feel that Microsoft isn’t doing a good enough job of securing their customers’ assets. They feel that they have a responsibility to the public to expose the insecurity of the platform. It’s one thing to discover corner cases and other easily-overlooked flaws, while it’s entirely another to realize fundamental design flaws that lead to exploit after exploit being discovered. Microsoft has made a respectable effort to change their ways, but it remains to be seen whether this effort will carry over to third-party vendors on the Microsoft platform. As we note from these vulnerability comparisons, OSS vendors take ownership of the flaws in most third-party software, whereas Microsoft doesn’t.
As for the economics of malicious hacking, they do it because it’s easy to drag-net the Internet and come up with copious amounts of potential victims. With millions and millions of computers running the same software with the same flaws, it has become as simple as point-click-own. Not only do we need to decrease exposure time, but we also need to eliminate the hassles that necessitate scheduled patch windows, and more generally, we need computers to be running a larger variety of platforms and software.
I look forward to OS vendors promoting technologies such as virtual machines and HA clustering to bring automatic service failover and easy patch validation/rollout to mainstream IT. Hackers will quickly become frustrated if unexploited virtual machines keep coming up to take control from the one they just exploited! Failover will become the equivalent of putting a theft-recovery sticker on your car window. Once they see it, they’ll realize that this is going to be more trouble than it’s worth and move on. Resilience, recovery, and first-failure data capture are critical to turning the cat-and-mouse game into more of a Road Runner–Wile E. Coyote sort of situation.
I really don’t see much bias by the author in the article. At the end even he said that the statistics really don’t mean all that much. All that normally ends up being proved by these articles is that you can use the statistics to prove pretty much anything you want to.
In theory, Windows is only attacked more, and only successfully attacked more, because of its dominant market share.
But that theory doesn’t hold true to Real Life when you take into account market share of Apache VS. IIS http://www.theregister.co.uk/2004/10/22/linux_v_windows_security/
A lengthy review of the above topic: http://www.theregister.co.uk/security/security_report_windows_vs_li…
And that includes other Myths that are clearly not true.
Further, if you take into account motive, then it quickly swings the other way. With Linux systems, the source is free for anyone to look at, analyze, and develop exploits for. Windows is not; Windows is trial and error until you find an exploit.
And with the world’s biggest companies and largest hosting companies using Linux and BSD to run their systems, it seems like an appealing target. Imagine getting at Amazon’s customer information, or at least redirecting the sign-up/buy pages to the FraudFriendlyCountry of your choice.
There are major systems running the largest services, both financial and informational, on Linux or BSD, that have some connectivity to the Internet. There is motive. In fact, there is just as much motive, if not more, to be able to simply walk into these largest systems as opposed to stealing Ma&Pa’s Flower Delivery customers’ info.
But, in real life, it is easier to gain Administrator than Root.
Edited 2007-03-19 02:14
“In theory, Windows is only attacked more, and only successfully attacked more, because of its dominant market share.
But that theory doesn’t hold true to Real Life when you take into account market share of Apache VS. IIS”
—————
I’m not sure what point you were trying to make, but IIS6 has a much better security record than does Apache 2.x. Hell, IIS6’s record is nearly perfect.
IIS6 security record since it was released in 2003:
http://secunia.com/product/1438/?task=statistics
Three vulnerabilities, none rated as “Highly” or “Extremely” critical, and all patched.
Contrast that with Apache 2.x’s record since 2003:
http://secunia.com/product/73/?task=statistics
31 advisories, 3% “highly critical”, 10% unpatched and 3% “partially” patched.
Edited 2007-03-19 02:47
“I’m not sure what point you were trying to make”
Actually I was referring to the years of that not being the case.
My point was motive. That there is a very large motive to be able to exploit large Linux and BSD systems.
And Linux gets hacked all the time. Just less than Windows:
http://www.zone-h.org/component/option,com_attacks/Itemid,44/
Yes, I know that these are website hacks, and I know what we are talking about is much more than that.
But there is more to security than exploitable code.
Secure by Default and Secure by Design are as important, if not more important than secure code:
http://en.wikipedia.org/wiki/Secure_by_design
http://en.wikipedia.org/wiki/Secure_by_default
As an example, Windows XP Home sets up two (three if you count System) administrative users, by default, with no way to apply a password to them. Then, after you apply a password to your account, it still hides Administrator from you “to protect you” (from forgetting its password), so that almost every XP Home system *at least* has one Administrator account (aptly named so you don’t have to guess) without a password.
This level of insecure defaults and design permeates through Microsoft software.
I set up XP Pro, fully patched (pre IE7), with the MS Shared Computer toolkit, and locked it down with its highest security settings. Leaving only IE accessible for Internet access. I typed “Desktop” into IE and could then create shortcuts to… anything I wanted.
Linux is far from perfect, far from secure especially in code, but it sure as hell does a better job of secure design and secure defaults.
Microsoft is doing better. Much better with Vista, IE7 and IIS6. But there is so much left open, and they still fall short in many areas (UAC for example).
Just an opinion of mine : )
Edited 2007-03-19 03:12
chrono13 – If you are not biased then who else is? You are now just changing your tone because it has been shown that Apache has a worse security record compared to IIS6.
Now you are trying to turn the topic around and then leave your opinion with no example. Where is Microsoft falling short?
You think UAC is bad, then what about Ubuntu? They prompt for the logged-in user’s password. Why can’t a rogue application show me the same prompt and steal my password? IMHO UAC is better than the password-prompting approach of Ubuntu.
“chrono13 – If you are not biased then who else is?”
I find neutral opinions scary. Give me two strongly opposing opinions any day from two biased sources rather than one from a “neutral” person or source. I’ll make up my own mind.
“You are now just changing your tone”
No, my point was motive.
“Now you are trying to turn the topic around”
The topic is Windows VS. FOSS security. I haven’t changed that.
“and then leave your opinion with no example.”
I could pull stats from all over the web. Dozens and dozens of security experts claim UAC is a weak, half-baked approach to a real security solution. But that is their opinion. I happen to agree with them. I doubt I will convince you by having you read a dozen of their articles as opposed to my ramblings. Even if theirs are written more concisely, with examples.
“Where is Microsoft falling short?”
A limited account with three or four dozen “lock-downs”, and I can bypass it with a single word? That wasn’t an example? UAC has been blasted for its “just in case” elevations that are unnecessary, even for Microsoft’s own software, including Vista itself, triggering it far too often for minor things. That is in addition to all the other alerts, notices and warnings. It is desensitizing and does not solve the issue of least privilege.
Microsoft has continuously and consistently fallen short in Secure by Design, and in Secure by Default. And it has taken until now for them to even *begin* to reverse these terrible design flaws.
Would Linux be (successfully) attacked as much as Windows if it had the same market share or 50/50 share?
I doubt it. I base this on what facts, bias and knowledge I have, but like everyone else, until it happens, it is all speculation and opinion. I said it was my opinion because you seemed to think I was stating it as irrefutable fact.
Take it with a grain of salt. If you look at exactly how these metrics are measured, you won’t find them exactly even. 70% of statistics lie, and the other 30% are made up. That goes for both sides of the fence.
But that’s my opinion. What’s yours?
chrono, you know, now I actually agree with you in almost all your points. The main point about the statistics that I tried to make was to bring a better view on the whole “fixes released” issue.
Now, about your UAC comment, you are spot on. What I don’t get is that when you start Vista there are more pop-ups, notifications and warnings (if you don’t have AV installed) than on a spyware-infected Windows XP machine (OK, exaggerated, but still…). And it actually becomes annoying after a while. I think spyware will continue to thrive because, with that many pop-ups and UAC answer requests, a lot of “basic home users” (read: Joe Average, non-techie) will either crap their pants every time it comes up OR, most likely, will get so annoyed with it after a while that they will click OK on anything…
that is my opinion others may vary, restrictions apply etc. etc.
//Flosse
{ You think UAC is bad then what about Ubuntu? They prompt for logged in user’s password? Why can’t a rogue software show me the same prompt and steal my password? IMHO UAC is better than the password prompting approach of Ubuntu. }
How is the “rogue software” going to get execute permissions on a local Linux machine?
It is only the Windows NTFS and FAT filesystems that lack execute permissions. Linux distributions typically will not allow you to have a root filesystem that does not support execute permissions.
It is only the Windows kernel that will happily run an executable file that has not had any local user explicitly give it permission to run.
It is only Windows systems that will allow a rootkit to be installed via the simple act of putting a CD in the drive.
UAC is an exceedingly poor substitute for proper security execute permissions that are lacking in the Windows filesystems.
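The execute-permission point can be sketched quickly (Python, standard library only, run on a Unix filesystem): a newly created file is not executable until someone explicitly flips the bit.

```python
import os
import stat
import tempfile

# A freshly created file on a Unix filesystem is not executable by default;
# someone (a user or an installer) must explicitly grant the execute bit.
fd, path = tempfile.mkstemp()
os.close(fd)

mode = os.stat(path).st_mode
print(bool(mode & stat.S_IXUSR))    # False: a saved/downloaded file can't just run

os.chmod(path, mode | stat.S_IXUSR)  # the explicit opt-in step
print(bool(os.stat(path).st_mode & stat.S_IXUSR))  # True

os.unlink(path)
```

That explicit opt-in step is exactly what the Windows filesystems (as used by default) don’t force on downloaded executables.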
Edited 2007-03-19 09:27
“UAC is an exceedingly poor substitute for proper security execute permissions that are lacking in the Windows filesystems. ”
Uh, you are partially correct: FAT does not have file permissions, but NTFS has proper permissions, with inheritance and fine-grained control. To get the same sort of permissions on Linux, you must use SELinux. UAC is not there to fix file permissions; it’s there to fix the ever-present problem of everyone running as admin.
Yeah, NTFS offers file permissions in the form of ACLs. Linux filesystems support secure file permissions but are more sane, with only three classes – user, group, and other – each with read/write/execute permissions.
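For reference, those three classes pack into nine bits, which Python’s `stat` module can decode – a small sketch:

```python
import stat

# Classic Unix permissions: user, group, other, each with read/write/execute.
# 0o754 means rwx for the user, r-x for the group, r-- for everyone else.
mode = 0o754
print(stat.filemode(stat.S_IFREG | mode))   # -rwxr-xr--

# Decode each class by hand: three bits per class.
user  = (mode >> 6) & 0o7   # 7 -> rwx
group = (mode >> 3) & 0o7   # 5 -> r-x
other = mode & 0o7          # 4 -> r--
print(user, group, other)   # 7 5 4
```

Nine bits total, versus an NTFS ACL that can carry an arbitrary list of per-user and per-group entries.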
I use SELinux on the Fedora-based distro I’m running along with sudo/gksudo, NoExecute security, buffer overflow/stack protected GCC, etc.
Except for SUSE, RedHat/Fedora, hardened Gentoo or Debian, most distros are unfortunately lacking in the pre-emptive security department.
I disagree that Linux/Unix’s file permissions are more sane than NTFS’s; I think having finer-grained control is important. But that is just my opinion.
The finer the grain, the larger the likelihood for error – with that being said, implementing ACLs isn’t a simple beast; there was quite a good discussion by FreeBSD developers about the specification itself, and the huge holes that exist in it, which has resulted in ‘vendor tweaks’ to the specification (and incompatibilities).
Solaris 10, along with Trusted Extensions, has ACL support – hopefully in an upcoming release of JDS, ACL support will be present in the permissions dialogue of GNOME – which should make life easier for the GUI-bound administrators out there.
Technically, they’re not actually running as admin; IIRC they’re running in “Power User” mode, and UAC merely elevates their privileges when required – similar to sudo.
I don’t have a problem with the idea; the problem I have is with the way it has been implemented – completely blanking the screen, only having ‘continue’ rather than demanding a password – because it’s almost a certainty that an end user will get annoyed and simply become ‘continue button’ happy, and when they are faced with a genuine security issue, they will have missed it in the haste of just ‘getting rid of the damn message’.
Oh, I don’t think it’s as good an implementation as sudo; I was just commenting on the fact that NTFS has real ACLs and file permissions.
Yes, there has been a major improvement in Windows/IIS 6.0; but that is due to pressure being put on Microsoft by competition rather than a genuine desire by Microsoft to do the right thing and secure their software.
But don’t be surprised to see Microsoft going back to its sloppy habits with the launch of Windows 2007 Server – to secure a system, it must be designed from the ground up to be secure; there is no such thing as a ‘bug free’ operating system, but if you design it properly the first time, patching and controlling the damage caused by a security issue is made a lot easier.
Microsoft knows exactly what the problems are with Windows, but it would require them to throw out the whole system and start again; yes, there are some good technologies which Microsoft has, but at the same time, a lot of it is motivated by a desire to ‘control’ rather than simply to deliver the best product to the customer.
It’s been seen over and over again; ODF vs OOXML, and no attempt by Microsoft to sit down with the ODF community and voice their concerns – backwards compatibility? Bull crap; I can open up a Word document and save it as ODF in OpenOffice.org – does ODF need ‘Word backwards compatibility’ in the ODF specification? Of course not; it’s a ruse by Microsoft to justify re-inventing the wheel.
Passport, another example; “give us all your customer data, and we’ll sell it back to you” – another attempt to jam a ‘solution’ down the throats of customers, and the customers turning around and saying no.
How about Microsoft work on improving their products rather than reinventing the wheel for political purposes which yield no benefits to customers either short or long term.
{ It’s been seen over and over again; ODF vs OOXML, and no attempt by Microsoft to sit down with the ODF community and voice their concerns – backwards compatibility? Bull crap; I can open up a Word document and save it as ODF in OpenOffice.org – does ODF need ‘Word backwards compatibility’ in the ODF specification? Of course not; it’s a ruse by Microsoft to justify re-inventing the wheel. }
In respect of ODF vs OOXML, Microsoft do not need to justify “re-inventing the wheel” so much as they need to justify “utter avoidance of using open standards”.
Wherever there is an existing perfectly adequate open standard, Microsoft avoid it like the plague, lest it become commonly used.
ODF is but one example. Web standards are another, Microsoft Java is another, SVG is another, as is ogg vorbis.
The list is endless. If it is an open standard which any software vendor may use and implement … Microsoft will do their utmost to see that Windows systems don’t support it, so that the standard will hopefully die, and everyone will be forced to use Microsoft’s proprietary alternative, which in turn requires that one runs a Windows platform.
In respect of ODF vs OOXML, Microsoft do not need to justify “re-inventing the wheel” so much as they need to justify “utter avoidance of using open standards”.
First, OOXML is an open standard. Second, MS has been pretty clear that ODF doesn’t handle all of the scenarios that they want to address; hence, ODF isn’t suitable for many applications.
First, OOXML is an open standard. Second, MS has been pretty clear that ODF doesn’t handle all of the scenarios that they want to address; hence, ODF isn’t suitable for many applications.
And every one of those scenarios, pardon my French, is a load of bullcrap – backwards compatibility, WHY?! Open a Word document, save as ODF, and that’s it; like I said in the previous post – do you see the ODF committee implementing bits and pieces for backwards compatibility with Word?
If you move from one format to another, there is no *NEED* for backwards compatibility – ODF simple? Of course it is; that is the whole point. You take a very simple specification and build the various parts of the specification together to get it to do what you want it to do; it’s like UNIX commands: by themselves they are weak, but put them together via a pipe and you can do a lot.
Also, Microsoft *had* the opportunity to voice their concerns with the format; they’re part of the OASIS standardising community; Corel got their voice heard, along with IBM as well; why was Microsoft playing Nigel No Mates instead of sitting down and saying, “hey, these are some problems that I see with the specification which could severely constrain our ability to provide compatibility between ODF and our products”, and then addressing the issue?
Microsoft *CHOOSES* not to be a team player; they want things done in 5 seconds rather than realising that to come up with a standard it takes a committee, and if it takes a little longer, then so be it – at least then, after all the parties agree with it, there is support for the standard, rather than the current situation with Microsoft, who invent a technology and then wonder why no one outside Microsoft is willing to use it.
Edited 2007-03-19 23:01
It’s not as simple as saving to ODF. There’s a fair amount of data that would be lost if you were to save some legacy Word docs to ODF.
First, OOXML is an open standard. Second, MS has been pretty clear that ODF doesn’t handle all of the scenarios that they want to address; hence, ODF isn’t suitable for many applications.
Sorry, wrong on both accounts:
First, OOXML is no open standard. There are several keyword definitions in the MS OOXML Format which simply state “behaveLikeWordXYZversion”. OOXML CAN only be fully implemented by ONE vendor: Microsoft. Not what I call open.
Second, the developers of the OOXML <–> ODF converters did try to throw EVERY feature they could find in MS Word onto the ODF format, doing several roundtrips converting the documents back and forth. No feature was lost. This was done by using the extensibility of XML. Most features were properly represented already in the standardized ODF namespace. For the few features which were not represented in standard ODF, a new namespace which incorporates these features was defined. So a Microsoft application can read in ALL the features it saved, and every other application just reads and represents whatever part of the namespace it understands and can render.
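For anyone who hasn’t seen the mechanism, a hypothetical sketch of how such an extension namespace works: standard ODF markup, with the extra Word-only data carried in a separate namespace (the `msext` namespace URI and element names here are invented for illustration, not taken from the actual converter).

```xml
<!-- Standard ODF content plus an extension namespace for a feature ODF
     doesn't model. The msext namespace and its elements are hypothetical. -->
<office:document-content
    xmlns:office="urn:oasis:names:tc:opendocument:xmlns:office:1.0"
    xmlns:text="urn:oasis:names:tc:opendocument:xmlns:text:1.0"
    xmlns:msext="http://example.com/ns/word-compat">
  <office:body>
    <office:text>
      <text:p>An ordinary paragraph every ODF application understands.</text:p>
      <!-- Namespaced extra data: preserved on round-trip, silently ignored
           by applications that don't know the msext namespace. -->
      <msext:legacy-field msext:behavior="word95-autonumbering"/>
    </office:text>
  </office:body>
</office:document-content>
```

Applications that don’t recognise `msext` simply skip it, which is what makes lossless round-tripping possible without bloating the base standard.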
It has been shown pretty clearly that the only thing stopping MS from using ODF as their standard file format has always been their internal monopoly politbureau. MS does not like truly open standards because it weakens the lock-in they currently have across all their products with high market share.
Yeah, it’s actually 33 advisories, with three unpatched (9%). That’s pretty good considering Apache is open source – people can inspect the code – and has 20% more market share than IIS6. In 2003–4 Apache had about 30–35% more.
I appreciate the follow up article to the “Monthly Security Scorecard”. If nothing else it’s a blatant reminder as to why I switched to FreeBSD for my desktop boxes over a year ago.
But… holy typos. Come on, pal. At least run it through a spell check.
The relative security of Apache and IIS cannot really be used as a valid arguing point about Windows and Linux security. These are server applications that generally require a (somewhat) knowledgable administrator to set up. Additionally, the designers of such programs have security as probably the foremost object in their minds. Looking at the hack rates for server applications tells you little about how security is going to work on a client machine.
I just spent some time reading the Wikipedia article on setuid/setgid, and the “sticky bit.” These are the primary form of UNIX security that will appear to the user. I think it’s a little silly that the primary means of security that is presented to users is this rather confusing scheme of bits which change meaning depending on which type of object they are attached to. NT-based Windows has had some implementation flaws and arguably a bad choice of defaults (I say arguably, because given intelligent users and a relatively friendly environment running as admin is far more convenient for home users). But the security design of Windows is far more sensible, consistent, and understandable from the perspective of the user. I think UGO and friends is simply an example of “worse is better” and the sooner Linux starts really using POSIX ACLs the better.
{ But the security design of Windows is far more sensible, consistent, and understandable from the perspective of the user. }
Absolutely.
It is far, far easier for users to understand “run anything from anywhere at any time, without regard to security” than it is to understand execute permissions of any kind.
It is also far, far better for a software vendor to be able to run something without permission on YOUR machine … better for the software vendor, that is, of course, not better for YOU.
Also UAC is a better model from a software vendor’s perspective … it is easier that way to then blame the users for the security shortcomings of the software vendor’s software.
Edited 2007-03-19 09:49
This tells me that open source software might have known security holes open for a few days (often hours) before being patched in CVS and sent to the distros. Not much of a problem as long as your box is up to date.
I think the reason open source has more problems is simple. The development cycle is very fast paced and anyone can examine the code and run security audit testing.
Some think Windows has a good share of security holes… just imagine what might be lurking under Vista’s brand-new networking stack and millions of new lines of kernel, subsystem, and driver code.
I’m sure Microsoft runs internal audit testing, but that’s not enough by a long shot.
At least UAC and Protected Mode IE are doing a good job of concealing them from the malicious hacker azzwipes.
“just imagine what might be lurking under Vista’s brand new networking stack and millions of new lines of kernel, subsystem, and driver code.”
That is too scary to even think about.
Why am I not surprised by the results?
I can just explain what happens here:
– What the f–k? These charts are showing that my favorite OS just sucks.
– I can’t tolerate it, I will make another one that will prove that my favorite OS is just the best out there.
Anyway, I don’t see why these charts should be more accurate than the old ones, or why I should consider Ubuntu the winner of these charts.
Yes, they have 100% of known issues patched, but it has so many more security holes than the other OSes…
Duffman, it wasn’t meant as a show-off of which OS is better (as I said, I use OS X and it’s not that great there either); it was meant as a more complete view than Jeff’s. There are just so many factors playing in. And Ubuntu and RedHat aren’t winners. There are no winners; it’s just interesting to see that, yes, there are holes, but they are fixed fast. Also, as someone already mentioned, I counted ALL vulnerabilities; these include things that you might not even run (Apache?).
Everyone else: as far as bias is concerned, everyone is biased whether we like it or not, but as I mentioned, I tried to stay with the plain numbers and look at the same criteria for all the OSes.
Regarding the speeeeellcheque (:)), I am truly sorry. I ran it quickly now and will do a more thorough scan when I am at home. The worst ones should be corrected.
Thanks for pointing it out.
//Flosse
I had a similar discussion with a Windows guy over the weekend. His contention was twofold. First, Linux security isn’t really there, since anybody can see the code, i.e., allowing hackers to find exploits before they have been fixed or discovered by people who will work to fix them. The other was that because Linux is “so scattered” you don’t get any uniformity. He was basically talking about package management and dependencies. We’ve all been down that road on one side or the other, so there’s not much point in going there again.
I don’t know what the truth is, but my feeling is this: Linux, like Unix, currently has a better fundamental security model than Windows. I do think that some Linux distros include too many “extras” out of the box (that’s a personal opinion). I also think that as Linux grows it will become a much more attractive target to hackers and will more often be successfully attacked.
On the other hand I also think that as Linux grows it will also get more people reporting vulnerabilities and patching will still remain a high priority.
As far as how accurate any security comparison between different OSes can be, I’d say probably not very. I agree that this data is usually over-analysed when presented to the general public, though for researchers a lot of analysis is probably required. The fact, to me, is that any system has the potential for a breach. The more programs (or lines of code) you add to a system, the more you increase its potential vulnerability. However, the biggest threat to any computer system is still the user. No amount of patching or security will stop users from doing stupid things.
EDIT: I’m just going to add one thing to my comment about the user. Most users lack any real understanding of how to secure a PC (regardless of what OS they are running), but of late I’ve noticed that the developers of our security software have taken to pushing the security of our computers back onto the end user. I can say from experience that after a while I just start ignoring the pop-ups. Security is only as good as the end user, and the more security software or measures rely on the end user, the more they will fail to secure.
Edited 2007-03-19 15:13
“First off Linux security isn’t really there since anybody can see the code i.e. allowing hackers to find exploits before they have been fixed or discovered by people who will work to fix them.”
The availability of the code may actually lead a potential criminal to decide against an attack. Anyone with a little education about coding complexity will soon recognize how hard it is to find an exploitable flaw just by reading source.
Some years ago, I tried to figure out how long it would take to brute-force a UNIX password before our sun explodes. I found that it would only be feasible if the password had a maximum length of 4 characters, granted you had unlimited tries and each try took 1 second. These are assumptions you won’t find in reality, because after the 3rd or 5th failed try a blocking period would follow. You can easily calculate the time needed with the formula A = t · Σ(i=1..m) |Z|^i, where A is the total time needed, t the time for one try, m the maximum password length, and Z the alphabet of valid password characters.
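As a quick sketch of that back-of-the-envelope calculation (the parameters here are illustrative: 1 second per try and a 26-letter lowercase alphabet, not values taken from any real login system):

```python
# Worst-case brute-force time: A = t * sum_{i=1}^{m} |Z|^i
# t = seconds per try, alphabet_size = |Z|, max_len = m.

def brute_force_seconds(t, alphabet_size, max_len):
    """Time to try every password of length 1..max_len, one try each."""
    return t * sum(alphabet_size ** i for i in range(1, max_len + 1))

SECONDS_PER_YEAR = 31_557_600  # Julian year

for m in (4, 8):
    secs = brute_force_seconds(1, 26, m)
    print(f"max length {m}: {secs:,} s (~{secs / SECONDS_PER_YEAR:.2f} years)")
```

Even with these generous assumptions, a 4-character lowercase password falls in days while longer passwords take centuries, which is the point of the formula: the search space grows exponentially in m.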
“The other was that because Linux is “so scattered” you don’t get any uniformity.”
This has been discussed before. Linux is about choice. And about competition.
“He was basically talking about package management and dependencies.”
These are real issues, but they are improving. For example, PC-BSD’s PBI package system avoids dependency hell entirely; you pay for this with more disk space. Furthermore, Linux’s apt-get is a versatile package management tool.
“Linux, like Unix, has a better fundamental security system than Windows currently. I do think that some Linux distro’s include to many “extras” out of the box (that’s a personal opinion). I also think that as Linux grows it will become a much more attractive target to hackers and will be more readily successfully attacked.”
As I mentioned before, compromising such a system requires considerable effort. Of course, if you know someone’s root password, the work is done. Just assume Linux users are a bit more educated than “Windows” users: they know a bit more about security and won’t feed pirated copies of some strange software into their system. I won’t say it’s impossible to compromise a Linux or UNIX system, because that just isn’t true. But it is true that it’s much more complicated.
“On the other hand I also think that as Linux grows it will also get more people reporting vulnerabilities and patching will still remain a high priority.”
I agree. I also think this will be the way.
“As far as how accurate any security comparison between different OS can be I’d say probably not very accurate.”
This is primarily because the comparisons count applications that do not belong to the OS itself. “The OS” just refers to the basic installation, such as it comes from the CD. For example, if you install “Windows” and, right afterwards, some phone-home malware bundled with trojans that can bypass any security barrier installed after it, you’re done. The same could be said if Linux came with a password-free root account.
“The facts to me are that any system has potential for a breach. The more programs (or lines of code) you add to the system only increases it’s potential vulnerability.”
Please don’t confuse program functionality with the raw number of lines of code. Well-commented code may have twice the lines of code that effectively does something. It’s not about the mass of code; it’s about its quality.
“However, the biggest threat to any computer system is still the user. No amount of patching or security will stop users from doing stupid things.”
This is exactly the point; I would not try to disagree. I would just mention that Linux makes it harder for the user to do stupid things. Unfortunately, in its push to gain more of the home-user market, Linux will drop some security barriers because the home user presumably feels more comfortable that way (e.g. autologin, automount, autoexecute). But the usual home user does not care about security anyway.
“Most users lack any real understanding of how to secure a PC (regardless of what OS they are running), […]”
From my experience here in Germany, I can confirm this. It’s not that users just don’t know; they don’t want to know. They don’t care. They make claims they can never prove, such as “I’ve never had a virus!”, while their PC is spreading spam across the Internet and storing pirated software or credit card numbers for some criminal.
Linux makes it easy for these users because it’s well configured with regard to security. I hope “Vista” can compete with this, but there are still lots of unpatched “XP” boxes out there burdening the Internet with their spam.
“[…] but of late I’ve noticed that the developers of our security software have taken to pushing the security of our computers back off onto the end user. I can speak from experience that after awhile I just start ignoring the pop-ups.”
Security warnings would have to be displayed in an ever-changing way, with nice music, dancing elephants, dogs driving a car, many colors and jumping, flashing, squeaking buttons, with a background video, so the user stays entertained. As soon as the user starts ignoring them and clicking “Yes to everything”, these security measures, although intended to be useful, become completely useless. Furthermore, the user has to be made aware that security is not just about him; it’s about others, too.
“Security is only as good as the end user and the more security software or measures rely on the end user the more they will fail to secure.”
That’s a good final statement. It could be added to the statistics as well.
“These are true issues, but they are improving. For example, PC-BSD’s PBI package system avoids any dependency hell, you pay for this by needing more disk space. Forthermore, Linux’s apt-get is a versatile package management tool.”
Yep, and the old ‘dependency hell’ hatchet still lives on. I’m in complete agreement with you about apt-get, but let’s expand it further: if Yum is properly set up, it shouldn’t cause any grief, nor should *.pkg on Solaris if you’re using a package manager/retriever like pkg-get from Blastwave.
“Dependency hell” seems to be the last line of defence for the Windows support base: when in doubt, come up with a half-truth and amplify it until it equals ‘the end of the world’.
It’s like the myth that “Linux is too hard”. Linux isn’t too hard; end users are too lazy. I’ll be the first to admit that 9 times out of 10, when I had a problem with Solaris, Linux, or *BSD, it was because I didn’t read the friendly manual. Computers are difficult; operating systems, no matter what they are, are difficult. It’s just the nature of the beast. If you don’t want to confront that complexity, then do what an author once did: throw out the computer and get a typewriter.
“I had a similar discussion with a Windows guy over the weekend. His contention was two things. First off Linux security isn’t really there since anybody can see the code i.e. allowing hackers to find exploits before they have been fixed or discovered by people who will work to fix them. The other was that because Linux is “so scattered” you don’t get any uniformity. He was basically talking about package management and dependencies.”
Ah, the ‘security through obscurity’ myth; if I had 10 cents for every time I heard a Microsoft fanboy scream that one-liner to high heaven, I would be a very wealthy man indeed.
To claim that, somehow, it’s easier to read through a million lines of code looking for one small coding error than to mount a brute-force attack using tools that probe at vulnerable points would be a silly assertion to make.
The insecurity of Windows rests on one basic premise: that end users are dumb and don’t update their operating system in a timely manner. Even if Microsoft were to release patches before the security problems were publicly known, you would still have the lag between patch deployment and someone developing exploit code that takes advantage of those vulnerabilities.
The lack of updating on the end user’s part comes down to one of two things. First, they’ve been bitten once by Microsoft and fear that if they do apply the patch, all hell could break loose. The flip side is the ignorance of end users: “I only bought this computer last week, how come it needs updating!” or “I bought this copy of Windows XP yesterday, and it needs updating!?” (yes, there are people that stupid; somehow a magical fairy is going to visit the various shops, magically updating the CDs on the shelves). Unfortunately, the likes of Microsoft and Apple perpetuate the myth that “computers are easy!” Someone needs to break it to the public: computers are complex devices, and you need to know your ‘shit’ before you use one. That is the first line of defence. Stop telling end users they can become a ‘power user’ overnight by virtue of owning a computer; if end users were knowledgeable, half the security problems out there wouldn’t exist.