Last week, security vendor Sophos published a blog post in which it said that Windows 7 was vulnerable to 8 out of 10 of the most common viruses. Microsoft has responded to these test results, which are a classic case of “scare ‘em and they’ll fall in line”.
It’s something politicians are very good at. They will create a threat or a problem for which only they have the solution. Security vendors use this tactic all the time as well, and these Sophos test results are a clear example of that.
Sophos provides no description of the test whatsoever; the only thing the company states is that it used a clean, default copy of Windows 7 with no antivirus software installed. It then leaves the methodology at that, and jumps directly to the results: 8 out of 10 common viruses work on Windows 7.
What they don’t tell you, however, is how the viruses got on the machine in the first place. Did the users have to perform any action? Did they just connect the machine to the internet and let it get infected all by itself? The latter would be pretty bad, of course, and is reminiscent of the days of Windows XP.
From the wording of the blog post, as well as the lack of details on how the test was performed, it becomes clear that Sophos simply “installed” the viruses to see if they would run. You can hardly call that any sort of a test – as we all know, there’s no patch for the meatsack sitting between the chair and the monitor. Just as Mac viruses that require user action to work are no indication whatsoever that the Mac is insecure, this test proves absolutely nothing either.
Microsoft seems to agree that this is a classic case of a security vendor trying to use sensationalism to sell its own products, but the Redmond company does state that users should always be running antivirus software on their machines. “While I’m not a fan of companies sensationalizing findings about Windows 7 in order to sell more of their own software, I nevertheless agree with them that you still need to run anti-virus software on Windows 7,” writes Paul Cooke on the Windows Security Blog.
Cooke further points to Internet Explorer 8’s SmartScreen filter, which the Sophos test obviously disregarded by installing the malware manually. “The SmartScreen Filter was built upon the phishing protection in Internet Explorer 7 and (among other new benefits) adds protection from malware,” Cooke explains, “The SmartScreen Filter will notify you when you attempt to download software that is unsafe – which the SophosLabs methodology totally bypassed in doing their test.”
If there is one type of company which I dislike even more than your average company like Microsoft or Google, it definitely has to be security vendors. They make products that cripple computers, and then try to sell these products by coming up with these pointless tests that prove nothing.
Scare ‘em and they’ll fall in line. Works all the time.
Don’t they use sensationalism themselves to sell their own products? Windows has never been a secure platform, but they have improved a lot since Windows Vista.
No, default install is just a lot more insecure than Vista. You have to manually set UAC to always prompt otherwise it is easy to circumvent.
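For reference, the “always prompt” behaviour being described corresponds to a couple of UAC policy values. This is only a sketch, assuming Windows 7’s standard policy keys (equivalent to moving the UAC slider to “Always notify”); edit the registry at your own risk:

```text
Windows Registry Editor Version 5.00

; Prompt for consent on every elevation, Vista-style
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System]
"ConsentPromptBehaviorAdmin"=dword:00000002
; Show the prompt on the secure desktop, so other processes can't spoof it
"PromptOnSecureDesktop"=dword:00000001
```

The Windows 7 default is a more permissive value that skips the prompt for Windows’ own binaries, which is the “easy to circumvent” behaviour being complained about here.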
OK, I didn’t know that – I don’t use Windows at all; I’ve only tried Win7 for a few weeks. But I wasn’t really satisfied enough to use it as my default OS.
This isn’t entirely true. UAC is less secure, definitely – however, the operating system itself also has other new security features. In other words, calling the entire OS less secure is a bit premature.
Doesn’t negate the fact the changes in UAC are braindead.
Well, at the end of the day security is account separation, which is effectively dead in the new “streamlined UAC”.
Compared to that, the other enhancements I read about at http://technet.microsoft.com/en-us/library/dd560691.aspx are just minor tweaks, or meaningless to consumers, so IMO the default install (not the whole OS), at least, is less secure. But defaults matter big time when you have 94% market share.
And the whole UAC issue could be avoided if Microsoft refused to support poorly written applications and bundled a Windows XP virtual machine with every copy of Windows 7. If they did that, the whole malarkey with UAC would be a non-issue. It is end users demanding that their 20-year-old applications work perfectly with the latest version of Windows, and vendors who refuse to update their software, knowing full well that Microsoft will never force them to make it run properly in a limited-privilege environment.
Each layer of backwards compatibility adds more complexity and more surface area that criminals can target. Microsoft could sort it out tomorrow, like I said. They could move backwards compatibility into virtualised Windows XP sessions and withhold Windows certification from software vendors who refuse to bring their software up to standard – but the cold hard reality is that when push comes to shove and the difficult decisions need to be made, they crumple.
As much as I want to believe you, we don’t know if it’s that simple. We talk about backwards compatibility as if it’s a simple package that comes with an InstallShield uninstaller, but in reality we have no idea how entrenched “backwards compatibility” is into the operating system.
They’ve just been fined massively, and forced to change their operating system for something as mundane as including a browser or a media player – how do you think the DOJ and Kroes would respond if Microsoft did something like that?
I’m sure just about every engineer inside Microsoft wants to do just that, but this isn’t Apple we’re talking about – it’s Microsoft. They are treated differently because of their market position, and can’t just do the kind of cut-throat code cutting Apple can do.
Microsoft knows where it is: they’ve noted the deprecated parts that are only there for backwards compatibility, and they created the virtualised registry to work around permission issues in applications which assume they have administrative privileges.
The solution is easy – include in the manual: ‘If the application doesn’t run, right-click and select “run in virtualisation mode”’, whereby Windows XP fires up in seamless mode (which VirtualBox supports) and the application appears like any other desktop application, but inside a virtual machine that is sandboxed off from the rest of the machine.
Based on what evidence? They can still call it compatible; they just can’t get the sticker. That is no different from a person writing a JVM but being unable to call it Java until it meets certain specifications. Heck, Microsoft does it already with the Windows-compatible logo, where hardware vendors have to meet a minimum set of requirements before they can affix the logo to their hardware. Making software vendors meet a certain set of criteria before they can affix a compatibility logo would be no different from the OEM side of the business.
Mate, there was a manager a while back who said, “legacy code is an asset”. Excuse me, but when has a rusted car on the front lawn of a property, without wheels, up on four concrete blocks, ever been considered an asset? In any other situation it is an eyesore and a source of property depreciation.
When you have managers so far out of touch with reality, so devoid of any idea of the virtualisation technology actually out there, you know they should be put out to pasture. They had their time in the spotlight; time to let the spotlight shine on people who aren’t living in an age where COBOL is the new and up-’n’-coming language of choice for business.
There are a few problems with this approach, among them the performance hit that would come from virtualization (which might be small, but won’t be zero), and the fact that a virtual machine doesn’t expose the host’s hardware well (in particular, so far as I know, there’s no good, high-performance way to expose the host’s GPU). There’s also the problem that you’d then have a lot of still-fundamentally-insecure apps running together in a virtual machine, on a guest OS that’s less secure than the host. If any of those legacy apps manage sensitive information and the virtual machine gets compromised, you have a serious problem. And many insecure, low-level APIs don’t virtualize well.
Apple did something like this when they moved to OS X: if you had an OS 9 (or earlier) application, OS X would try to run it in what amounted to an emulated OS 9. It didn’t work very well; most legacy apps either didn’t run well or didn’t run at all, and they didn’t integrate with the rest of the system regardless. I think most Mac users took the hint: they wrote off their Mac Classic applications, used OS X-native equivalents where they existed, and did without when equivalents weren’t available. I know that’s what I did.
I’m a fan of virtualization, but it’s not a panacea, and it’s not really a good way to handle any legacy apps on which you’re dependent. At least, not in a desktop environment.
My other concern is that legacy applications and backwards compatibility really are good things. As someone else on this site has elegantly said before, you don’t throw out a code base with a 20-year track record just because the OS vendor says it’s time to move on.
Virtualisation isn’t meant to be a long-term solution – it is only there for backwards compatibility until such time as the customer can upgrade their software to a version that is compatible with the underlying operating system. It is a zimmer frame for applications – that is it. Time for people to wake up and stop expecting software to be perpetually supported on their computer – expecting perpetual support is as stupid as the person who fills up their car once with petrol and is pissed off on finding out that the tank needs filling again.
You buy a car, you need to fill it up with petrol and maintain it. You exist because you go out and purchase groceries from the supermarket. You want to run Blu-ray? Get a Blu-ray drive. Life is a continuous movement forward – stop trying to hold onto the door frame like a child being told they need to go to the dentist.
And you know, here we are eight years later, after Apple bit the bullet, and they have a top-of-the-line operating system. They made the tough decision when they needed to – Microsoft, release after release, doesn’t want to address the problem. They’re like the obese person who tries every diet under the sun – the pickle diet, the orange diet, the prune diet – all in the hope that there is an easy way out, instead of facing the reality that it is calories in, calories out. Microsoft is like that obese person: avoiding what needs to be done by gravitating around the periphery.
Who the hell said anything about throwing out old code for the sake of it? When the new code addresses all the flaws of the old code, and a period of time has been given for programmers to migrate off the old API, you then need to remove it. More code staying in the code base means more surface area for a hacker or cracker to aim at.
Yes, keep old code for a period of five years to allow customers to migrate off it, and address the concerns if the new API lacks certain features developers require – but that isn’t, and shouldn’t be, an invitation to keep layering up multiple APIs from 20 years’ worth of development. You create an API; five years later you realise that assumptions made in its design aren’t meeting requirements, so you create a new API that replaces it. You deprecate the old one, you remove the ability to compile against it, and eventually you remove support from the operating system.
Again, it is pathetic and childish to label what I posted as merely a knee-jerk urge to throw 20-year-old ideas out the window because I feel like it. I’ve laid out reasons why you should – not just practical but economic reasons as well. Instead of repeating the same things over and over, address why what I state can’t and won’t work in reality.
Lateral and constructive thinking could be used to solve virtualization and legacy-application problems. Lateral thinking is concerned with using random words to change concepts. Constructive thinking places judgements from people side by side, rather than following the old Western argument system. It’s time to move away from these age-old problems.
Isn’t this exactly what “Windows XP Mode” for Windows 7 does, using a virtualised XP install running in a headless VirtualPC instance? And it even puts application shortcuts into the Start Menu for these apps.
Yes it is, but ‘Windows XP Mode’ isn’t included with all versions of Windows 7 – only the highest-end ones. It also doesn’t do away with the deprecated parts of the operating system, or the workarounds implemented for the sake of backwards compatibility, such as registry virtualisation.
Very entrenched. All of that should be scrapped for good. The ugly, useless stuff should be left to run in a virtual machine with a provided copy of Windows XP. They could cut a sizeable chunk of useless crap code out of the OS, stop maintaining it, and leave it where it belongs.
Sure they can, as long as they provide a way for existing software to keep running.
I think Microsoft deserves all it gets. They constantly knock other OSes and do things like anti-Linux training, so they deserve to have their OS knocked about (even if the way it is done here is a bit of a waste of time). The fact is that Windows has never been, and will never be, a secure OS – unless they really do go right back to the drawing board and start again.
And Linux will never let you install new hardware without recompiling your kernel.
Oh wait. We’re both wrong.
The use of outdated criticisms only demonstrates your own technical ignorance.
I apologise for not elaborating. Microsoft has come a long way since XP in securing itself, and Windows 7 is far more advanced in its security features, but nonetheless it is still not secure compared to Linux, for instance.
Until both OS’s are tested in the wild with the same level (and stupidity) of users, along with the same level of focus from the bad guys, this really cannot be stated as fact.
Why? All else aside, the fact of the matter is that the bad guys *don’t* attack non-MS OSes with anywhere near the intensity that they attack Windows. This makes Windows a far more dangerous operating system to run. Period.
Look at it this way. If you had a choice of being put into a battle zone without a bullet proof vest, being put into a battle zone with a bullet proof vest, or staying at home watching Nova (with or without a vest), which would you choose? Which would be safest?
I’ve never understood folks who whine that if Operating System Q were attacked as much as Windows, they would have problems, too. Because there is only one family of OSes which *is* attacked so violently and consistently. And that is the Windows family of operating systems.
It reminds me a bit of that scene in “Whatever Happened to Baby Jane?”.
To paraphrase:
—
Blanche: If *only* I weren’t always getting attacked by all this malware!
Jane: Butcha *are*, Blanche! Ya *are* getting attacked by all that malware!
—
People need to learn to face reality. And the reality is that regardless of the relative security features of the OSes themselves, Windows is a far more dangerous OS to be running than just about anything else, because it’s the one with the target painted on its back.
This argument is getting very tired indeed.
Firstly, Linux has significant market share in areas where it is an attractive target … servers for example.
Secondly, in Linux the “paradigm” for installing new software is not to download & run stuff from some random website, but rather to use a package manager.
I believe package managers have an impeccable record.
Over many years, for thousands of packages, for many Linux distributions, for millions of users, I have never heard of a single case, ever, of an end-user’s system being compromised with malware through installing software using a package manager.
Amongst many millions of Linux users, there has got to be the odd stupid one here and there you would think.
Sigh. Linux server != Linux desktop. Servers are locked-down far more than desktops. You can’t extrapolate one from the other. Apples and oranges. Once you start opening up ports to run things like BitTorrent, web browsers, etc, the attack vectors become multiplicative.
Um, that works fine if you only run open source software, but there are MANY cases where no open source application exists for what you want to do. So, what does a user do? Fail? I don’t think so.
So what. There have been cases where repositories have been compromised. Only dumb luck prevented you from getting screwed by a malicious attack.
http://www.eweek.com/c/a/Security/Security-Web-Digest-Major-Open-So…
Millions? Talk about overly optimistic…
Nevertheless, the argument that “Linux is not an attractive target” is utterly debunked by the number of Linux servers.
No, you just don’t think.
The package managers and repositories do not require that applications they contain be open source. There are binary-only repositories which allow for distribution of closed-source applications via package managers.
Being closed source means that such applications are not auditable, but that does not mean they necessarily contain malware. They can still benefit from the secure delivery channel to end-users systems offered by package managers.
As an example, Adobe’s Flash player for Ubuntu is delivered by package managers. Ubuntu has a “third-party repository” to provide for just this kind of distribution.
https://help.ubuntu.com/community/Repositories/Ubuntu#Third-Party~*~…
“The “Third-Party Software” tab is where you will be able to add the Canonical Partner Repositories. You will see two Canonical Partner repositories listed – one for applications and another for source code (src). The partner repositories offer access to proprietary and closed-source software and are not enabled by default. Users must specifically enable these ‘partner’ repositories. Select “Close” and “Reload” to save and update the database if you chose to add either or both of them.”
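Concretely, enabling that partner repository boils down to a single line of apt configuration. A sketch, assuming Ubuntu 9.10 (“karmic”) – the release codename varies by version:

```text
# /etc/apt/sources.list.d/canonical-partner.list (hypothetical file name)
# Closed-source packages (e.g. Adobe Flash) delivered through apt:
deb http://archive.canonical.com/ubuntu karmic partner
```

followed by a `sudo apt-get update` to reload the package database, after which the proprietary packages install through the same signed channel as everything else.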
This is an incident where a GNU server was hacked – broken into. No system is invulnerable to a hack where a password is either guessed or illegally obtained. No malicious code was injected onto the server, and no end users’ systems were compromised.
Pfft.
http://www.desktoplinux.com/news/NS5114054156.html
“Eric Lai quotes ABI analyst Jeff Orr as saying that the study shows that 32 percent (about 11 million netbooks) of this year’s netbook shipments will be used with a Linux-based operating system. “
There are 11 million desktop Linux systems right there, in one small section of the market, in just one year.
The fact that, for thousands of packages and many, many millions of users over many years, the one incident you came up with resulted in no end users’ systems being compromised rather proves the point, doesn’t it, about the relative security of Linux desktop software distribution compared to Windows?
Thank you for illustrating it so nicely.
BS. Those servers are running a paltry number of services and are locked-down tighter than a nun’s thighs. Those kinds of environments aren’t as attractive as desktops because the cost of finding and exploiting a vulnerability is considerably more difficult.
Again, it provides no independent means of auditing, which debunks your claim about package managers being safer. They’re merely another distribution channel.
So much for your “secure” claim.
And, naturally, ABI doesn’t offer any details to back up its claims on what MIGHT happen in the future.
Sigh! This depends ENTIRELY on what you mean by “attractive”. For your meaning above, you are correct, but that is not what was meant by “attractive” in the original context of the argument.
In its original context – “Linux systems aren’t attractive targets for malware” – the word “attractive” actually refers to what the blackhats might gain by getting their malware onto the target systems. In that context, servers are a lot more attractive than desktops, as they generally hold a lot more valuable information.
When you add closed-source repositories, yes, you kind-of have a point (I have made another post about this). They are indeed then merely another distribution channel … a safer-than-anything-on-Windows distribution channel with an impeccable record to date.
How so? Elaborate please?
PS: No system is invulnerable to hacking via knowing the password. None at all.
However, if any attempt were made to put a malware binary onto a GNU repository server: it would show up in the server logs; it would be detectable by comparison against the source; and there would have been an enormous hoo-ha made over it.
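The “detectable by comparison” idea can be sketched with plain checksums – a simplified stand-in for the GPG-signed manifests real repositories use, with made-up file names for illustration:

```shell
# A project publishes a tarball and a checksum manifest alongside it.
printf 'pretend this is the hello-1.0 source\n' > hello-1.0.tar.gz
sha256sum hello-1.0.tar.gz > SHA256SUMS

# On a clean mirror, verification passes.
sha256sum -c SHA256SUMS

# An attacker with root on the mirror tampers with the tarball...
printf 'injected payload\n' >> hello-1.0.tar.gz

# ...and the very next integrity check flags the mismatch.
sha256sum -c SHA256SUMS || echo 'checksum mismatch: tampering detected'
```

In a real repository the manifest itself is cryptographically signed, so the attacker cannot simply regenerate SHA256SUMS to match the tampered file.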
Once again, the reality about repositories and package managers is … impeccable record. Impeccable.
So? ABI’s predictions for the future are based on what they measure in the real world today.
BTW: Dell says that it sells one third of netbooks with Linux:
http://blog.laptopmag.com/one-third-of-dell-inspiron-mini-9s-sold-r…
That is so NOT true. Desktops are far more attractive targets for malware now because (a) they’re more readily exploitable, (b) blackhats can create and sell bot-nets composed of exploited desktop machines to spammers for a ton of cash, (c) desktops generally don’t keep network logs (which makes covering their tracks easier). That said, if you can find a zero-day exploit in something like SSH or SSL or a popular network daemon running on a server, that’s VERY attractive. But since it’s so difficult to achieve, desktops are even more attractive, based on technical difficulty.
But, again, this doesn’t provide me (as a user) any reassurance that the closed-source software doesn’t have a timebomb or, worse, some kind of exploitable problem.
It’s simple: Stop pretending that repositories are magical and “secure”; hence, packages aren’t any more secure.
Show me numbers. For all we know, these numbers could have been cooked for a boutique customer. You guys should be familiar with this line of reasoning: You claim this about IDC and other “research” firms all the time regarding Microsoft and anyone else you oppose.
Right. Less than 5% of netbooks in the US. Nice.
Windows desktops are attractive, sure. There are a lot of them, and they are easy to compromise. Very attractive, no doubt.
In other sense, based not on technical difficulty but on the value of compromising the target, Linux servers are attractive targets. Technically very difficult targets, sure, but nevertheless very attractive.
Sure, if you deliberately add closed-source repositories, you are knowingly taking a risk (because of the closed source) but hey here is your still-remaining assurance anyway:
Of course they are. What part of “IMPECCABLE RECORD” did you fail to understand?
The numbers are worldwide (not just the US). Worldwide numbers are very hard to “cook for a boutique customer”. This is why the numbers DON’T show a huge percentage of Windows – the numbers aren’t cooked.
I’ve often thought that Americans have no sense at all. Completely bonkers, wouldn’t know a good deal if it bit them on the arse. Phineas Barnum was apparently right … for Americans. You are helping in a big way to enhance that kind of image.
Hold on a sec. You had argued that Linux desktop security was somehow demonstrated by the existence of a large number of secure Linux servers and, as I pointed out, that’s a BS extrapolation. You can’t extrapolate desktop security from Linux servers: They’re locked-down tight, they run as few services and open as few ports as possible, they generally have no UI, etc. In other words, the server usage profile is NOTHING like that of the average desktop computer. NOTHING. Servers don’t use BitTorrent. Servers don’t use media players. Servers don’t download unknown binaries and execute them. Servers don’t read user email. Enough said. If you can’t understand the distinction, you can’t be helped.
The record is NOT impeccable. I just showed you an example of a compromised repository. It was plain dumb luck that you didn’t get 0wn3d. That does not translate to impeccability.
ABI hasn’t released the source of their numbers. As I said, show me the source.
It’s actually quite simple. Americans don’t consider saving $50 on a copy of Windows to be a good return on investment, when you have to struggle with a dearth of quality commercial (aka non-crap) applications and wasted time spent in console sessions…
Meh. Your point above is valid, but irrelevant to the discussion. All of what you say above, while valid, does not mean that Linux is not an attractive target. Linux on servers is an attractive target because of the significant number of them, and the fact that they hold generally more valuable data than individual desktops.
I actually argued that the fact that Linux is an attractive target is demonstrated by the number of Linux servers. You have a disconnect in your counter-argument.
Yes it is. There have been no known instances of malware getting installed on an end user’s system via repositories and package managers. Impeccable record.
Which is not an end user’s system. The compromise itself was the result of a hack; it has nothing to do with the repository/package-manager system itself. The trojan did not get onto the compromised system via a package manager/repository.
Says you. You do not address the point that no software on the GNU servers (that might subsequently have propagated to end users) was changed. The only thing that happened was a trojan on the server OS itself, which isn’t a compromise of the content of the repository it served.
Any change to the actual GNU software repository contents would have shown up in the SVN logs (or whatever they used).
Still, regardless of your shrill and increasingly desperate protests, there is the fact of the actual impeccable record that you have not even come close to refuting.
Sigh! Linux desktop applications are nearly always better quality than expensive Windows equivalents. Linux desktops themselves are certainly much faster on the same hardware, and far far less vulnerable.
There is an old saying that applies very well here I feel … “you can lead a horse to water, but you cannot make it drink”.
BTW … you can run all of the desktop software on Linux desktops these days without ever once opening a console session. You really need to keep up with the times if you are ever going to become a more effective Microsoft shill.
Finally, it is not a matter of just $50. Getting malware because of running Windows can cost many millions.
http://windowsitpro.com/Articles/Print.cfm?ArticleID=40473
http://windowsitpro.com/Articles/Print.cfm?ArticleID=40473
BTW, the reason why the link above is relevant to this topic is tied up with the first quote above. Linux desktops do provide a mechanism, a simple-to-follow process, whereby the system will stay malware-free. This has proven in the past to be 100% effective. One simply has to stick to a self-imposed policy as follows: (a) always install software via the package manager, and (b) never enter your password except at login and in the package manager.
There is no equivalent process or policy that one can follow when using Windows. Using Windows will leave you vulnerable to malware. This could potentially cost you a great deal of money.
Would it be possible for you to reference articles that aren’t 6 years old?
Nonsense. If a user wants to install an application (regardless of whether they’ve been warned not to), whether they click Yes to UAC or enter a root password to install the software under Linux, it’s going to bite them.
You really are disconnected from reality, aren’t you? Customers aren’t going to restrict themselves to the package manager. They are going to install applications from all kinds of places. That’s reality. We’ve already covered this ground. You can’t audit closed-source binaries through the package manager, and we know that the package manager won’t be sufficient for users; HENCE, you’re going to have the same kinds of malware problems under Linux. Any reasonable person can see that, and denying it can only be explained as intentional deception on your part.
Nonsense. To use your logic, users can only download software via Microsoft or Amazon. Problem solved.
Sorry, I didn’t check the date.
Here is a more recent reference for you.
http://www.newsweek.com/id/208652
Should Software Firms Be Liable for Malware?
It isn’t nonsense, we’ve been through this. Impeccable record, remember?
Not the point. If the user has available a means of keeping their system malware free, and they choose not to follow that, then it is the user’s responsibility.
But for it to really be the user’s responsibility, the software vendor has to provide a reasonable means of staying malware free that can actually work. If there is no such means available at all, and a user gets malware and loses money as a result, then there is a case to make that the supplier of the insecure software system should be liable.
But then it is not the software supplier’s fault. The software supplier need only point out that “We provided a secure means of updating and install new applications, and the user didn’t follow it”.
But we don’t have the same kinds of malware problems under Linux. Impeccable record, remember?
Pfft. Why do you continue to pretend that you have shown a malware breach of the package manager/repository method of update & distribution when you actually haven’t? I don’t see the point in your intentional deception?
That won’t leave you free of malware. Not by a long shot. WGA is just one (admittedly tongue-in-cheek) example.
It doesn’t matter. If end customers don’t stick to the proven-safe package managers, that is the customer’s decision. The default set of packages installed, plus the packages in Ubuntu’s repositories, numbers over 25,000, so it isn’t as if you don’t have a fully functional system if you just stick to those packages. The software vendor did provide a safe (and functionally comprehensive) method of distribution that the customer chose not to stick to.
Errrr, the Linux software vendor, that is. Microsoft haven’t provided anything like it in Windows. Your Windows machine can pick up malware no matter what policies you stick to.
You clearly don’t get this simple concept: Users don’t understand the implications of what they’re doing; they want to run a piece of software, and they will do whatever it takes — clicking Yes on UAC or entering a root password in Linux. Blaming them or criticizing them or saying they should be running package-based software exclusively doesn’t change the fact that they don’t know what they’re doing. If these same users were running on Linux, the result with malware would be IDENTICAL. Their machines would quickly turn into piles of flaming malware crap. And that has nothing to do with Windows. It’s based on user ignorance.
I never blamed the software supplier. The problem is user ignorance and human nature, not technical issues.
You have negligible desktop market share; hence…
You keep claiming that package manager/repository-based distribution is secure. It’s no more secure than any other server based distribution method and, as I pointed out, repositories have been hacked in the past.
It’s no less (and no more) secure than package manager/repository based distribution.
See above. Most customers do not have the capacity to make an informed decision when installing software. And they will download anything that they like.
The same is true of Linux. When the user installs software of unknown origin, same result.
Edited 2009-11-12 07:42 UTC
Why do you continue to pretend that you have shown a malware breach of the package manager/repository method of update & distribution when you actually haven’t? I still don’t see the point of your intentional deception.
Sigh!
Hacking a system means gaining access by obtaining or guessing a password; it is not a breach of the repository for software distribution. The fact that a hack once happened on a repository server is neither here nor there: even with such a breach, and a presumably hostile hacker with root access on the server machine, no compromise of the repository/package manager system itself occurred. It is just too difficult to craft the required malware SOURCE code and inject it undetected into the repository system.
PS: Injecting binary code is of no use. The very next recompile (which probably runs nightly) will overwrite the malware you attempted to inject.
So no end user’s machine was in any way compromised, even though a hacker had full access to the repository server.
The package manager/repository system is proven secure. The record shows it. Even where presumably malicious agents had an opportunity to corrupt it they could find no way to do it that would not be immediately discovered. It has an impeccable record.
The package manager/repository system is the default for Linux desktop machines. By default, all of the software it offers is auditable by anyone at all. Access to the system is on the top-level menu, and it is exceedingly easy to use. It offers a vast (and safe) array of additional zero-cost software, far more than an ordinary Windows user could ever hope to find in years of careful searching.
What exactly is your point in trying to pretend otherwise?
Edited 2009-11-12 11:11 UTC
Either you’re completely and utterly in denial, or you’re intentionally full of shite. Either way, I’m done with this conversation. Anyone who’s still reading can see for themselves how “secure” your ecosystem is…
Attacks on Package Managers
http://www.cs.arizona.edu/stork/packagemanagersecurity/attacks-on-p…
package managers still vulnerable: how to protect your systems
http://www.usenix.org/publications/login/2009-02/openpdfs/samuel.pd…
How strange … after loudly beating a certain drum for ages, and being shown to be incorrect, you now try a completely different tack. Aren’t you even a tiny bit embarrassed? Have you no shame? One would think that it was of utmost importance to you to convince everyone that using the package management system is somehow bad.
Yet it has an impeccable record.
It is great, however, that security researchers worry that the system may have points via which it can be attacked, so that such potential weaknesses may be addressed well in advance of anyone finding a way of exploiting them.
Compare this to the situation with Windows, where the utter lack of a decent way to manage software installation/removal and prevent malware from being installed has been a serious, fundamental and completely unaddressed problem for decades.
Edited 2009-11-12 22:18 UTC
That’s hilarious: “We’re secure because, although somebody hacked our servers, nobody downloaded infected software from the hacked repository.” You’re quite a comedian.
You’re the one arguing for the so-called supremacy of package managers & repositories. A chain is only as strong as its weakest link. If you knew anything about security, you’d realize that and demonstrate a bit more humility.
As I said, the only thing that prevented you from getting 0wn3d was dumb luck. If somebody had really intended to do harm, they could have done it easily.
This hack was done far in advance of anybody actually detecting the break-in. Nobody was aware for months.
Most people would simply acknowledge the hacked repository as a serious flaw. They wouldn’t argue against reason that the system is “impeccable”. Are you really so desperate for people to try Linux that you’d sacrifice your credibility? Why? It’s ridiculous.
Oh, right. It’s why all the Photoshop and Autodesk and Office and Quicken users are just FLOCKING to use open-source alternatives to those commercial products…. oh, wait … they’re not. Ah, I get it: They’re just “dumb users”, right? They “don’t know better”, huh? LMFAO! Sorry, but in this case, there is a wisdom in crowds that refuses to embrace your BS hyperbole.
Linux distributions tend to take a very proactive approach to security and they provide a safe environment as long as you follow the rules. But so does Windows. The difference is that the majority of Linux users tend to be technically competent while the inverse is true for Windows users. Linux users follow the rules, or at least, have enough knowledge to know how to break them properly.
These days, the security problem is a social one– not a technical one. Take the ~90% share of Windows users and dump them all on Linux tomorrow and you’ll see the same problems emerge in that environment. A good majority of these people will run any binary, click any button, enter any password, and happily copy/paste chmod +x whatever; ./whatever into a terminal if that’s what it takes to make something happen.
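The danger in that copy/paste habit is easy to demonstrate. A minimal sketch (the script name `whatever` and the `/tmp` path are purely illustrative): anything a user marks executable and runs this way inherits that user’s full privileges, including read/write access to everything under $HOME.

```shell
# Hypothetical stand-in for "whatever": a script the user blindly
# pasted or downloaded. Once chmod +x'd and run, it executes with
# the user's full rights -- no exploit required.
cat > /tmp/whatever <<'EOF'
#!/bin/sh
echo "running as $(id -un), free to touch everything under $HOME"
EOF
chmod +x /tmp/whatever
/tmp/whatever
```

The point being: no operating system distinguishes a script the user *meant* to trust from one they were tricked into trusting.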
From a technical point of view, the major operating systems are all practically equivalent when it comes to security. None of Linux, Windows or OS X are attractive targets. The user is the attractive target. The user is the attack vector.
Arguing anything else at this point indicates either intent to deceive or a severe lack of understanding.
Windows users so don’t understand the Linux ecosystem.
Where are users going to get such instructions to follow that won’t be plastered all over the net as a no-no?
Perhaps here?
http://www.linuxgenuineadvantage.org/
Watch out though, that has been cracked!
http://www.alienos.com/articles/2007/02/02/linux-genuine-advantage-…
ROFLMAO.
Fortunately there is a tremendously simple applet, called “Ubuntu Software Store”, right there on the main menu. You can’t miss it.
http://osrevolution.com/os-misc/ubuntu-software-store-screenshots
This is so simple, and vastly easier than anything on Windows, so why would users do anything else? “Get Free Software” and thousands of choices … how good is that?
The way to add software on a Linux system is totally different to Windows:
https://help.ubuntu.com/9.10/add-applications/C/installation-windows…
https://help.ubuntu.com/9.10/add-applications/C/index.html
This is the first thing you have to learn.
For Linux, this is the system you have to attack. It is much harder to trojan a Linux system because you have to get past the normal way to install software.
Just remember: “impeccable record”. Mull over it for a while. See if you can figure out what it actually means, from a practical point of view.
Edited 2009-11-11 06:05 UTC
Close. Most Windows users don’t understand any computing ecosystem at all. That is my point.
Your argument is based on the assumption that users switching to Linux will follow the rules of responsibility for that platform even though they blatantly disregard similar rules on Windows.
Sorry. There is no logic in which that assumption holds true.
Software repositories are a fine approach and they can be convenient, but unless they are provided as the only method of installing software, they are not a solution to the security problem.
Take the iPhone for example. As long as you stick with the official app store, the iPhone is a secure device. Those who have jailbroken their phones to install software outside of the officially sanctioned method, however, are now facing security issues.
More or less, that’s pretty much what I was trying to convey.
Just on this … this is also an oft-touted claim, but it has no credibility without justification.
Gaming is one area where this perhaps has some semblance of validity, but if you want to play games, why not just buy a games console?
As for other, real-world actual desktop applications … I’d like to hear of one with wide adoption (say over 80% of desktop users would run applications of that kind) where one couldn’t get good software for Linux to achieve that end.
I’m talking email clients, browsers, Office suites, editors, collection managers, etc., etc. … exactly what kind of software do you imagine one can’t get for Linux?
Edited 2009-11-11 02:27 UTC
Way off topic, but this is rather ironic coming from you considering that one of the main advantages to PC gaming is that developers often release tools/SDKs that allow you to modify their games and share your work with others. Whereas consoles are about as locked down and DRM ridden as you can get.
I’ve seen this line of reasoning often enough to learn that supporting DRM and impenetrable devices, particularly for gaming, is just fine for some in the Free Software crowd as long as it serves to devalue one of the true advantages that Windows has over Linux.
Well, allow me to retort…
Ah, yes. That old familiar kneejerk response from a Linux fanboy upon discovering Use-Cases that they can’t handle: Criticize the user. Nice. How’s that working for you? Converting lots of “dumb, ignorant users” with that approach?
Um, sorry, but you don’t get to narrow the scenarios to some arbitrary percentage of users in order to deflect the damage. Users have all kinds of different needs — and in fact, needs that are already being met by OS X and Windows — so you’re going to have to try harder to pretend all they need is a web browser and an Office suite.
Mac/Windows…………Linux
Photoshop……………..GIMP (crap)
Quicken…………………Zilch
Autocad…………………Zilch
PageMaker……………..Zilch
Visio……………………Zilch
Access………………….Zilch
AfterEffects…………….Zilch
3DStudio MAX……………Zilch
A zillion vertical apps…Zilch
Edited 2009-11-11 04:19 UTC
Those are specific apps, not kinds of apps.
Photoshop is a hugely expensive application that wouldn’t be of use to more than 1% of the population. GIMP is more than good enough for the other 99%.
Quicken … Moneydance.
Access … OpenOffice base, firebird
Visio … Dia, OpenOffice draw
The rest also have a low user base, but anyway:
Autocad … http://www.linuxlinks.com/Software/Graphics/CAD/
Pagemaker … Scribus
AfterEffects … cinefx, kdenlive
3DStudio MAX … Blender
A zillion vertical apps … none of which are used by 99% of people.
OK, so therefore you have shown that Linux is not the best choice yet for 1% of the desktop userbase.
You are just a tad shy of the 80% level-of-use applications I asked you about.
Edited 2009-11-11 04:44 UTC
Sorry, but you may know absolutely nothing about CAD programs; not a single one of those would be of use. My father happens to be a structural engineer, and believe me when I say AutoCAD IS THE industry standard. Beyond that, Visio just does not have a competitor. Cue up Adobe Premiere: the capabilities that these programs offer are just nowhere to be found elsewhere…period. As for Access, there again comes the issue of what the competition has to offer, and what they offer simply is not sufficient.
Maybe for some kids who just want to play around, some of this free stuff will suffice. But try to be a consultant and do some work in those Linux CAD programs, and you will NOT find work. Ditto if you want professional, quality video production. This is where I find you zealots to be close to dangerous. You care more about your own personal obsessions than anything else.
http://www.varicad.com/en/home/
http://www.varicad.com/en/home/products/description/
Visio is very good software.
In Linux, Dia, Inkscape, Uniconvertor and sk1 can all interoperate, and they can replicate a good portion of the functionality of Visio between them. None of them are as capable as Visio, and Linux cannot handle the (deliberately) obscure .vsd file format, but quite a lot of Visio functionality is covered anyway.
You can certainly create a Visio-quality diagram however. Certain types of diagram, such as UML, are no problem at all. Admittedly, the shape library is nowhere near as extensive as Visio, but it is growing.
I work in an engineering firm. Visio is a great tool, good for producing design sketches and diagrams, but you cannot create production engineering drawings with it. At $500 a seat, it is horrendously expensive for what it does. Because it is in use only a small percentage of any user’s time, Visio more often than not will be available at only a small percentage of seats.
If all you are doing is creating design sketches, use Dia or even Inkscape or OpenOffice draw.
http://en.wikipedia.org/wiki/OpenOffice.org_Draw
http://en.wikipedia.org/wiki/Dia_%28software%29
http://en.wikipedia.org/wiki/Inkscape
Just as good a result at a tiny, tiny fraction of the price.
That may be so … but it is a consideration for ONLY a very tiny percentage of desktop users.
You are going to have to try to support that a bit better than just mere assertion.
http://wiki.services.openoffice.org/wiki/Documentation/Database
Warning: the next link is a 62-page PDF (beginner’s guide).
http://documentation.openoffice.org/manuals/oooauthors2/0110GS-Gett…
Lots of functionality there. As a bonus, when you store your data, you are not forever-more locked in to an expensive platform and Office suite to access it.
Now you are just being ridiculous. You saying something does not make it so.
You say this, yet VariCAD and the like still exist, as they have for 11 years now, and are still making money.
http://en.wikipedia.org/wiki/VariCAD
VariCAD will cost you a lot less than AutoCAD and yet still process the same files.
http://www.varicad.com/en/home/
There are a very few odd vertical markets for which the only real solution is Windows. This does not affect 99% of users, however.
Pfft. Says you.
Edited 2009-11-11 11:09 UTC
I suppose if you’re up to your hips in shite, you do what you can: you shovel. But you’re in the unenviable position of trying to match round pegs with square holes, and I don’t think you have the slightest clue why most professionals who require the apps cited previously won’t move to Linux. Ever. The basic problem is that when your competitors are running industry-standard apps, apps that do exactly what they want and do it exceedingly well, while you’re offering crappy alternatives, you’re screwed. Oh sure, you can try to (a) criticize the user, or (b) cherry-pick some small subset of user scenarios and try to match a patchwork quilt of lame technologies to their needs. Kind of like throwing shite against the wall and hoping that it sticks. But it’s not going to convince anyone other than fellow Linux fanboys. Who you already had onboard your delusion train.
Edited 2009-11-11 23:41 UTC
http://www.lautman.net/mark/coo/index.html
Warning, the next link is a very large (6.5 MB) PDF file:
http://documentation.openoffice.org/manuals/oooauthors2/0400DG-Draw…
If you have bandwidth and are curious, Chapters 9 and 10 of the OpenOffice Draw guide, in conjunction with the custom OO shapes, will illustrate the Visio-like capabilities of OpenOffice Draw.
Not as good as Visio, but now consider the price:
http://en.wikipedia.org/wiki/Comparison_of_vector_graphics_editors
Microsoft Visio: $560
OpenOffice.org: Free (Open Source)
That is a huge difference. Considering that usage is normally only sparse, it is very, very hard to justify a Visio seat at that price/value trade-off.
Mac/Windows…..Linux
Photoshop…………..GIMP, Krita
Quicken…………….Gnucash
Autocad…………….PRO/E, Catia, …
PageMaker…………..Inkscape, Scribus
Visio………………OODraw, Umbrello
Access……………..OOBase + Database
AfterEffects………..Houdini
3DStudio MAX………..blender
Zilch………………Computing on 4 of 4 cores + working in desktop app
Of course, if you have experience with apps you use on Windows that are not available on Linux, you will experience quite a few disappointments. And most likely you will not switch.
As did I when I was switched from HPUX and Linux over to Windows at my workplace, and found out how hard it can be to find a decently working editor in the Windows world. In the end I left the company.
This is quite unfair, because many of those applications already exist. What you say following this quote isn’t tied to any tangible use.
I would like to see back-of-book indexing software for Linux.
Macrex runs on another Unix, but is really for geeks, not Indexers. Cindex is the best, runs on Windows, uses a database creation layout.
There are many Open Source books.
This software is not for geeks; there are indexing courses around the world, cheapest in the United States.
Edited 2009-11-11 11:11 UTC
http://en.wikipedia.org/wiki/LyX
http://www.lyx.org/
http://wiki.lyx.org/Tips/Indexing
http://www.lyx.org/Features
For reference, LyX is effectively a GUI front-end for LaTeX
http://www.latex-project.org/
Very powerful, professional document preparation software.
http://www.latex-project.org/intro.html
http://en.wikipedia.org/wiki/WYSIWYM
Edited 2009-11-11 12:21 UTC
You’re just thinking about keyword indexing, like in a word processor or the Google search engine. This is insufficient.
Do I need to spell out why there is specific software for these tasks for incompetent d*ckhe*ds like you?
LyX indexing is intended to produce an index for a non-fiction book.
http://www.linux.com/archive/feature/56471
So other than creating a book index, I don’t know what you mean. Yes, you need to explain why LyX support for creating book indexes is insufficient compared to other un-named “specific software for these tasks”.
You need to point out why specific support in Lyx for creating a book index isn’t also “specific software for these tasks”, when it would seem that that is exactly what it is.
Edited 2009-11-12 13:15 UTC
and the FSF doesn’t knock other OS’s? Or Apple? Grow up.
Chicken and egg.
If Microsoft trains representatives to lie with anti-Linux FUD, it has to surely expect criticism in return.
http://www.linuxpromagazine.com/Online/News/New-Anti-Linux-Propagan…
I mean, really:
http://quaoar.ww7.be/ms_fud_of_the_year/569458-microsoft-attack-lin…
outright lies, pure and simple. Caught red-handed just plain lying.
As usual, Microsoft’s “Get the Facts” campaign spreads totally unsubstantiated lies about Linux which it calls fact.
…
Remarkable is Microsoft’s claim that in the case of a security leak, Linux offers no guarantee of a patch – ignoring the fact that, in the past, critical breaches in Linux have never been left for any notable length of time without a security patch being released. Unlike Windows, where a known security issue can stay unpatched for two years. Which shows that it’s Microsoft that should be wary of offering guarantees for patches.
Microsoft’s biggest porkies are about the security of its OS in comparison to others, as usual.
Edited 2009-11-10 12:52 UTC
Agreed!
Anti-Linux FUD has nothing to do with the misrepresentations that Apple employs in its ads, and most of the problems listed on the Windows 7 Sins page are just FUD, or problems that were solved ages ago.
I’m pretty sure that the FSF predates Linux, so they have been spreading the word since long before MS started to get worried about Linux.
MS is not the only one that lies, and as a user of both Windows and Linux, I can tell you that the FUD from both sides is kinda sickening.
There is no lie in saying that Linux isn’t guaranteed a patch for a flaw. There is no one company behind it, to ensure that flaws will eventually be patched.
As for 2 years, I guess you forget the OpenSSL weak key flaw that was a bug from mid-2006 until mid-2008 huh?
Edited 2009-11-11 21:30 UTC
This is true. I suppose then there are only the estimated 1.5 million full-time-equivalent developers involved with open source, who can all see the code and submit patches against identified problems, and whose best interest is undoubtedly served by promptly fixing any identified security problem.
An as-yet-unidentified bug is not an unpatched security flaw. It is a bug.
An unpatched security flaw happens when a security bug is known to the general public, but no fix yet exists.
There was only a very short time span for the OpenSSL weak-key flaw … it wasn’t hard at all to fix, as the flaw was caused by removing the code that seeded the random number generator (to silence a warning about uninitialised memory). As soon as it was identified, it was fixed.
News flash, nitwit: NO OS IS SECURE, unless no one ever uses it and/or it is not connected to the Interwebs.
Have you ever installed Linux? Hello? Any updates needed after installation?
Jagbag.
Yes. Lots of them.
This is typical marketing. I want to see a conclusive test: a fully patched version of the OS, then test. Otherwise, this is a waste of keystrokes and bandwidth.
Do we ever see a conclusive test? Security vendors always exaggerate and Microsoft always defends itself. It’s business!
Current versions of the big-name OSes, fully patched.
Granted, to remove the chance of target bias, it’d be interesting to have the competitors hit each of the machines with the lovelies they brought. It would show which vulnerabilities were present on all platforms, or what mitigated them.
I hope Microsoft includes a really good Security Essentials with all Windows versions from now on, so these smoke-and-mirrors companies disappear once and for all.
Edited 2009-11-10 16:01 UTC
Why fully patched? Not everyone has the luxury of an internet connection. How about as it comes out of the box, or installed OEM on the PC?
I believe a computer without internet access is useless nowadays.
I have to agree. Even as a huge Linux user and fan, I have to say Linux is the worst for this. You can do very little in Linux without an internet connection. Unlike installing applications in Windows, having the file for an application does not always work, due to dependencies.
Although you can see how the trend in the way all platforms are being developed has gone towards the assumption that all computer users have access to the internet.
The pwn2own contest is a good independent security benchmark. I highly doubt there could be a conclusive test, though, because OS development is quite dynamic.
Typo in title…”Sopohs”???
Run Windows in LUA+SRP mode, folks. This is getting so tiring. I wish people who call themselves security experts knew the first thing about security.
“able to implement SRP on a VISTA PREMIUM (as explained here). There is no way to use any snap-in from Microsoft, as they have decided it was not for family members, but only for enterprise world”
http://www.wilderssecurity.com/showthread.php?t=232857
So, for the majority of versions which do not make LUA/SRP easy…?
On the Home versions of Windows Vista and Windows 7, the interface for SRP is the Parental Controls control panel (or the underlying API). Through this interface, you may restrict which applications may run (and more).
http://msdn.microsoft.com/en-us/library/ms711710(VS.85).aspx
http://msdn.microsoft.com/en-us/library/ms711654(VS.85).aspx
The windows guy said:
(my bold)
That’s simply not true. I’ve been using Linux exclusively, both at work and at home (at least 10 hours a day in total). I’ve never installed an antivirus and I haven’t had a single virus.
Edit: I forgot to mention I’ve been using Linux for five years now (and counting).
Edited 2009-11-10 14:33 UTC
As much as I would like to agree, having a false sense of security because we run Linux is dangerous. Yes, there might not be any (real) viruses for Linux out there, but I still don’t want to be a vector of transmission by passing infected files to other computers.
Of course we won’t face any threat if we use something nobody else uses, because, well, nobody cares! That allows me to surf the web and laugh at attempts to hijack my IE or even Safari, but it does not mean that my three-year-old unpatched Firefox is more secure than the sandboxed, firewalled, antivirused IE 8…
Often, when advocating Linux, I hear people saying that it is more secure and does not need antivirus. This is a dangerous idea of false security.
Firstly, antivirus isn’t security. Antivirus is trying to detect and remove a security breach after it has already compromised your system.
Secondly, the correct method of installing software on Linux is via the package manager. Package managers and the associated online repositories allow for a system where any piece of software can be audited and verified by any person on the planet. Anyone at all, not just the person who wrote the software. If everyone on the planet can see what is in a piece of software BEFORE it gets to end users, this makes it very difficult indeed to hide malware within that software.
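As a rough sketch of the distribution model described above (this is a toy analogy, not the actual internals of apt or yum, and the file names are made up): the repository publishes integrity data alongside every package, and the client verifies what it downloaded before anything is installed. Real package managers go further and GPG-sign the package index itself.

```shell
# Toy model of repository verification: the repo publishes a checksum
# for each package; the client recomputes it before installing.
echo 'pretend package contents' > /tmp/pkg.deb

# Done on the repository side, published next to the package:
sha256sum /tmp/pkg.deb > /tmp/pkg.deb.sha256

# Done on the client side before installation; fails loudly if the
# package was tampered with in transit or on a mirror:
sha256sum -c /tmp/pkg.deb.sha256
```

A tampered mirror therefore cannot silently swap in a modified package without also forging the signed index, which is the part an attacker with mere server access cannot do.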
Finally, one should examine the record. The record is AFAIK impeccable. AFAIK (and no-one has yet been able to contradict this) … there has never been an end-user’s system compromised with malware via installing open source software from package managers.
PS: On Linux, all programs by default run as a normal user. Running firefox on Linux means running it as a normal user, and hence it has no ability at all to modify or create system files or directories. All programs run as a normal user on Linux are effectively sandboxed.
Edited 2009-11-10 23:07 UTC
The “package manager and associated online repositories” doesn’t work with commercial/proprietary software, where you don’t have the source code. The best that an auditor can do in that case is GUESS whether the software contains malware or not; for example, an application may only reveal itself as malware under timed conditions (only destroying your machine or turning it into a zombie after a period of time). And, since there is an unquestionable need for commercial/proprietary software, you don’t have a solution.
Edited 2009-11-11 01:14 UTC
When package managers (on an end users system) are enabled to use an additional repository which holds binary-only software, then it is true that for that small set of packages the end users have no ability to audit them. They could potentially contain malware.
This is the risk one takes when one adds repositories for closed-source applications.
This is the PRECISE reason why such repositories are not enabled by default on most distributions.
You add the repository at your own risk.
My advice would be to refrain from adding such a repository until many thousands of expert users have had a chance to trial the applications. A few months after first release might be enough time. If there was any malware, it should have shown up by then.
Mind you, if a software supplier did set up a closed-source repository, and an application therein did contain malware, and end users did end up with malware as a result … that story would be all over the net in days. You wouldn’t hear the end of it. Windows fans would be jumping with glee, Linux users would be livid, and the site would be blacklisted (as a critical security update) almost immediately. You wouldn’t have time to blink.
The fact that this has never actually happened also nicely illustrates the security of package managers and repositories as a distribution mechanism, even when it comes to closed-source applications.
Keep going with these posts, you are doing a very good job so far of highlighting the fact that this repository/package manager system for distribution of Linux software is vastly superior to anything for Windows.
Satan, you don’t YET have to use an AV, because Linux is only 1 percent of the desktop market. There is no point in f***ing just one percent of the desktop market when you can target around 90 percent. If the desktop share were very similar, I think you might be using an AV permanently.
Back to the subject of the news:
And there are guys over there who say “Windows is insecure” and right after that state “I don’t use Windows”. How ignorant can you be to make a statement like that? I’ve seen the same statement from a professor at Carnegie Mellon in IT Security classes. The worst thing about this is that a person who teaches a subject usually dictates the culture around it, and that is a serious thing.
But we all are missing the point, here. I too agree that security companies spread FUD about a lot of things based on user (lack) of knowledge.
Oh, do you see how things are? It’s not only the companies that do that; anyone misinformed about something can state something “serious”. It all depends on WHO says it. And if this guy is a somewhat “misinformed” person at a somewhat big company, the information will spread. And eventually get a response and generate some discussions around the web…
So, have fun!
I am not convinced that this is a true reflection of Linux usage. Many people use both windows and linux on a day to day basis, but would probably be considered windows users.
While I have no doubt in my mind that as Linux increases its market share viruses will crop up, it is just not as simple and easy to create a virus that will infect Linux as it is to create one that will infect Windows.
It’s that devil-may-care attitude that got him kicked out of heaven.
Actually, Linux is far more prevalent than 1%, even if we look only at the desktop market.
Linux reportedly has 32% of the netbook market, for example:
http://blogs.computerworld.com/15068/where_is_the_linux_desktop_goi…
http://www.computerworld.com/s/article/9140343/Linux_s_share_of_net…
I always drop ClamAV on a box I build. No reason not to help protect the Windows machines one may be sharing files with.
Not using an AV somehow proves that you don’t get any viruses? Weird. It’s not like a virus will show a message on the screen telling you “you are now infected”.
Exactly. I always wondered how people know they don’t have viruses without “ever installing antivirus”. That goes for you Mac guys too!
Edited 2009-11-10 20:01 UTC
In fairness, there are usually indications of malware infections (unusual drive & network activity, suspicious processes, etc). And the people who run without anti-virus software can usually spot those signs on their own (or at least they think they can).
I always wonder how people know they don’t have viruses after having installed antivirus. That goes for you Windows guys! :-0
Maybe I’m naive, but I trust F-Secure. It’s proven itself to be mom-and-dad proof for over four years now! Plus, during installation on older machines, it only enables the components that it thinks the machine can handle. No crippling.
I’m really happy that it will be Microsoft who buries all the antivirus vendor vermin with their Microsoft Security Essentials suite. It’s very good and free.
I’ve been using Windows 7 for almost 6 months now. And you know what? I’ve seen no virus, no worm, niente, nada.
Sophos is pushing “the panic”. They have to. After all, if everybody thinks there aren’t so many malware threats any more, why would they buy Sophos antivirus?
Sophos sells AV, so I’d expect the marketing message “Win7 needs AV, and it should be our AV you use”.
At the same time, I also don’t think six months uninfected somehow disproves the need for AV. I’ve been running WinXP for years without a virus hit; does that mean WinXP does not need protective measures either?
You’re kidding right? You. One person has not been infected with any viruses with windows 7. Therefore, it is impossible to get a virus on windows 7? Is that really the conclusion you are drawing?
Uhm… I suppose most viruses are hoaxes because you haven’t been infected with them. So how many viruses aren’t hoaxes? Just the ones you’ve been infected with?
There was once upon a time that smart users didn’t need anti-virus software. That was before malware writers stepped up their game and found silent vulnerabilities in microsoft products that required no explicit user interaction.
I think there is no doubt that Vista, and now Windows 7, is more secure than XP; however, this is not saying much: an unpatched XP could be infected by just plugging into the internet – it almost infects itself. OK, the situation improved with the service packs, but this is still a very insecure OS.
Certainly with Vista we didn’t get mass infections like Blaster, and it seems reasonable to believe that Windows 7 will be similar to Vista (although, like many, I am concerned about the changes in UAC and wonder whether they will survive the first service pack). However, I’m sure an updated virus checker is needed. I see infected Vista machines every day, and it will be the same with Windows 7. I’d say here in Lesotho we have about an 80% virus infection rate on XP PCs and a lower but still very significant infection rate on Vista PCs. If you doubt these figures, consider Windows without the internet (no updated AVs) and prevalent file sharing via flash drives.
The legacy features, and Microsoft’s desire to maintain backwards compatibility, mean that this virus problem cannot be fixed – improved a little, perhaps, but not fixed. An AV is needed, and God help us in the third, non-internet world.
I note that the MS blog suggested all platforms need an AV. Well, hmm. I don’t use one on my Linux box, but I can see that it may become more necessary: looking at the Ubuntu forums, I can see that Ubuntu is beginning to appeal to the less technical, and soon to the folk who might well click on a see_naked_ladies file and enter the root password when asked. Obviously the problem will never reach, or even come close to, Windows levels, but let’s not be too complacent – no OS is invulnerable, and some users need protecting from themselves.
I’m a programmer. I used to write some quick and small malware just for fun. I have friends in my country who work on big antivirus products – RAV (ex-GECAD, acquired by Microsoft, now part of MSSE) and BitDefender. I even have friends in the underworld. And everybody agrees on this: writing worms targeting Windows is like trying to target a FreeBSD jail. Not impossible, but hard as hell.
Back in the happy days of Windows XP SP1, it was a breeze to write worms that propagated like the plague on Windows machines. But with the new security models, writing malware is much, much harder.
I mean, I remember the first open source Windows worm, rxbot, and the first open source Windows/Linux worm, agobot. I happily contributed to them and modified the sources. It was easy as hell to hack a Windows box. But not anymore.
Generally, if you want to break into a box, you use a buffer overflow exploit: you send crafted data to some port on a machine, and boom, you’re in. Not anymore. Not only are exploits getting patched really quickly, but even if you discover a 0-day exploit, you can’t really use it. You need to bypass the firewall (ports are no longer unprotected), and you end up controlling an application running in user mode. So you then need to bypass UAC, which is pretty complicated.
I’m not saying it’s impossible, but security is very hardened now, and it would take thousands of man-hours to produce something that works.
It’s probably easier to craft a trojan with “social” abilities that people will download and run without suspecting anything until it’s too late.
That doesn’t have anything to do with the software platform. Any software platform is vulnerable in that respect.
Yeah. I get the feeling that it was these Trojans that Sophos tested with. The OS doesn’t really do much against those except via the Malicious Software Removal Tool, that only targets the absolutely most ‘popular’ malware.
It is pretty much impossible to keep trojan programs out because they don’t violate the security model of the OS.
Yes, but wouldn’t it be better to protect yourself and your company from these attacks as much as possible?
More or less true.
There is, however, one desktop system available that allows one to hold to a policy of not downloading any software except via auditable channels (package managers). To hold to such a policy, all a user has to do is refrain from supplying his/her password anywhere except at the login screen and the package manager (which is the expected norm anyway).
If one simply sticks to such a policy, then no amount of cleverness in trojans with social abilities will be able to compromise the system.
So Windows 7 is immune to 2/10 popular viruses even when the user double-clicks the executable and then hits allow?
I think that’s pretty good!
That’s the whole point. Unfortunately so far in the comments, like usual, the discussion has gone into OS politics.
These guys intentionally executed viruses, 8/10 of which didn’t need permission elevation (they functioned at the local user’s level), and Windows rightfully allowed them to execute. I mean, we’re techies here; we understand there’s no magic involved in preventing something like this. However, Sophos can (ab)use it as a marketing tool when selling AV to normal users.
Just curious here … how do you imagine that Windows verifies that it was a valid user who caused the executable to be run and then caused a “click” to be registered on the allow button?
It seems to me that Windows doesn’t verify that at all. No entry of a valid password is required.
In addition, apparently Windows 7 automatically elevates the permission level of several Windows utilities without even a UAC prompt.
If he doesn’t have malicious software running to begin with, who else but the user could possibly issue the ‘click’ that starts up a trojan?
A script running in the web browser, Outlook or the IM client, sent to the machine by some random on the net.
An autostart script on a USB stick that was picked up while that stick was in another machine somewhere (say, at the library, the photo print shop, or the kids’ school).
Any hostile person who has unattended physical access to the machine for a few moments while it is logged on.
All operating systems are vulnerable to remote code execution bugs. In fact, the most recent serious vulnerability of this nature was a bug in the Java browser plugin and it affected all platforms.
Autorun on a USB stick was a brain dead idea and has finally been removed in Windows 7.
All operating systems are vulnerable to this.
The point is that the many many thousands of malware payloads that could use such an exploit are virtually all Windows executables.
Thank goodness. Why did it take Microsoft years to do that?
Nope. On secure systems, such a hostile person would need to know a password in order to elevate privileges. On Windows 7, all the same hostile person has to do is click on ‘allow’.
Not true. There are several attacks one could perform on a logged-on system to gain full privileges later by fooling the user into giving up his password. Depending on path settings, or specifics of the environment, you can create a script or program that masquerades as a legitimate, higher-privileged application and takes control the next time the user performs that activity.
Maybe there are some mitigations already in the Linux environment that I don’t know about. Do the DEs in some way protect shortcuts to important apps from tampering (e.g. the launcher icon for the package manager)? Is the path in the shell always ordered so that privileged directories come before unprivileged ones? Is there no way for a malicious program to reorder the path once it is established, or launch a sub-shell later on with a reordered path?
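To make the masquerading attack above concrete, here is a minimal, harmless sketch of the PATH-ordering part of it. All file and directory names are made up for illustration; the “impostor” just prints a message instead of doing anything malicious:

```python
import os
import shutil
import stat
import tempfile

workdir = tempfile.mkdtemp()
fakebin = os.path.join(workdir, "fakebin")
os.makedirs(fakebin)

# A harmless stand-in for a trojan posing as a privileged tool.
impostor = os.path.join(fakebin, "sudo")
with open(impostor, "w") as f:
    f.write("#!/bin/sh\necho 'fake sudo: could capture the password here'\n")
os.chmod(impostor, os.stat(impostor).st_mode | stat.S_IXUSR)

# If malware running as the user can prepend a directory to PATH...
os.environ["PATH"] = fakebin + os.pathsep + os.environ.get("PATH", "")

# ...command lookup now resolves 'sudo' to the impostor,
# not to the real binary in /usr/bin.
print(shutil.which("sudo"))
```

The point is that nothing here requires elevated rights: everything happens as the ordinary user, which is exactly why the environment (or shortcut icons) would need some form of tamper protection to close this hole.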
That’s irrelevant. All it takes is one. Over many years of using different operating systems, the only machine I’ve ever had taken over remotely without any action on my part whatsoever was a Red Hat 9 box. The attacker had tampered with the PAM configuration, replaced /bin/login, and had about a dozen new accounts running IRC bots. I found evidence of one of those little script kiddie rootkit packages that you can download just about anywhere. This is not an attempt to damn Linux. The whole event was completely my fault for not keeping the system “up2date”. The point is that hostile code exists for all platforms.
Remote code execution and privilege escalation exploits are becoming increasingly rare across the board these days anyway.
I assume it has something to do with the behemoth size of the company.
Given physical access to any machine without encrypted volumes, it is trivial for anyone with a moderate level of skill to install whatever they want on it.